Changeset 799 for trunk/docs


Timestamp:
Jan 31, 2016, 1:45:58 PM
Author:
cito
Message:

Improve adaptation and add query_formatted() method

Also added more tests and documentation.

Location:
trunk/docs/contents
Files:
5 edited
1 copied

  • trunk/docs/contents/pg/adaptation.rst

    r797 r799  
    22=====================================
    33
    4 .. py:currentmodule:: pgdb
     4.. py:currentmodule:: pg
    55
    66Both PostgreSQL and Python have the concept of data types, but there
     
    1515Adaptation of parameters
    1616------------------------
    17 
    18 PyGreSQL knows how to adapt the common Python types to get a suitable
    19 representation of their values for PostgreSQL when you pass parameters
    20 to a query. For example::
    21 
    22     >>> con = pgdb.connect(...)
    23     >>> cur = con.cursor()
    24     >>> parameters = (144, 3.75, 'hello', None)
    25     >>> tuple(cur.execute('SELECT %s, %s, %s, %s', parameters).fetchone())
    26     (144, Decimal('3.75'), 'hello', None)
    27 
    28 This is the result we can expect, so obviously PyGreSQL has adapted the
    29 parameters and sent the following query to PostgreSQL:
    30 
    31 .. code-block:: sql
    32 
    33     SELECT 144, 3.75, 'hello', NULL
    34 
    35 Note the subtle, but important detail that even though the SQL string passed
    36 to :meth:`cur.execute` contains conversion specifications normally used in
    37 Python with the ``%`` operator for formatting strings, we didn't use the ``%``
    38 operator to format the parameters, but passed them as the second argument to
    39 :meth:`cur.execute`.  I.e. we **didn't** write the following::
    40 
    41 >>> tuple(cur.execute('SELECT %s, %s, %s, %s' % parameters).fetchone())
    42 
    43 If we had done this, PostgreSQL would have complained because the parameters
    44 were not adapted.  Particularly, there would be no quotes around the value
    45 ``'hello'``, so PostgreSQL would have interpreted this as a database column,
    46 which would have caused a :exc:`ProgrammingError`.  Also, the Python value
    47 ``None`` would have been included in the SQL command literally, instead of
    48 being converted to the SQL keyword ``NULL``, which would have been another
    49 reason for PostgreSQL to complain about our bad query:
    50 
    51 .. code-block:: sql
    52 
    53     SELECT 144, 3.75, hello, None
    54 
    55 Even worse, building queries with the use of the ``%`` operator makes us
    56 vulnerable to so called "SQL injection" exploits, where an attacker inserts
    57 malicious SQL statements into our queries that we never intended to be
    58 executed.  We could avoid this by carefully quoting and escaping the
    59 parameters, but this would be tedious and if we overlook something, our
    60 code will still be vulnerable.  So please don't do this.  This cannot be
    61 emphasized enough, because it is such a subtle difference and using the ``%``
    62 operator looks so natural:
     17When you use the higher level methods of the classic :mod:`pg` module like
     18:meth:`DB.insert()` or :meth:`DB.update()`, you don't need to care about
     19adaptation of parameters, since all of this is happening automatically behind
     20the scenes.  You only need to consider this issue when creating SQL commands
     21manually and sending them to the database using the :meth:`DB.query` method.
     22
     23Imagine you have created a user login form that stores the login name as
     24*login* and the password as *passwd* and you now want to get the user
     25data for that user.  You may be tempted to execute a query like this::
     26
     27    >>> db = pg.DB(...)
     28    >>> sql = "SELECT * FROM user_table WHERE login = '%s' AND passwd = '%s'"
     29    >>> db.query(sql % (login, passwd)).getresult()[0]
     30
     31This seems to work at first glance, but you will notice an error as soon as
     32you try to use a login name containing a single quote.  Even worse, this error
     33can be exploited through a so-called "SQL injection", where an attacker inserts
     34malicious SQL statements into the query that you never intended to be executed.
     35For instance, with a login name like ``' OR ''='``, an attacker could
     36easily log in and see the user data of any other user in the database.
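To make the danger concrete, here is a plain Python sketch (no database needed) of what the ``%`` operator produces with such a login name; the resulting condition ends with ``''=''``, which is always true:

```python
# Sketch only: what Python string formatting produces when an attacker
# enters "' OR ''='" as both login name and password.
sql = "SELECT * FROM user_table WHERE login = '%s' AND passwd = '%s'"
login = passwd = "' OR ''='"

injected = sql % (login, passwd)
print(injected)
# SELECT * FROM user_table WHERE login = '' OR ''='' AND passwd = '' OR ''=''
```

Since the final ``''=''`` comparison is always true, PostgreSQL returns rows although no valid credentials were given.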
     37
     38One solution for this problem would be to clean your input from "dangerous"
     39characters like the single quote, but this is tedious and it is likely that
     40you overlook something or break the application, e.g. for users with names
     41like "D'Arcy".  A better solution is to use the escaping functions provided
     42by PostgreSQL which are available as methods on the :class:`DB` object::
     43
     44    >>> login = "D'Arcy"
     45    >>> db.escape_string(login)
     46    "D''Arcy"
     47
     48As you see, :meth:`DB.escape_string` has doubled the single quote, which is
     49the right thing to do in SQL.  However, there are better ways of passing
     50parameters to the query, without having to manually escape them.  If you
     51pass the parameters as positional arguments to :meth:`DB.query`, then
     52PyGreSQL will send them to the database separately, without the need for
     53quoting them inside the SQL command, and without the problems inherent with
     54that process.  In this case you must put placeholders of the form ``$1``,
     55``$2`` etc. in the SQL command in place of the parameters that should go there.
     56For instance::
     57
     58    >>> sql = "SELECT * FROM user_table WHERE login = $1 AND passwd = $2"
     59    >>> db.query(sql, login, passwd).getresult()[0]
     60
     61That's much better.  So please always keep the following warning in mind:
    6362
    6463.. warning::
     
    6766  the ``%`` operator.  Always pass the parameters separately.
    6867
    69 The good thing is that by letting PyGreSQL do the work for you, you can treat
    70 all your parameters equally and don't need to ponder where you need to put
    71 quotes or need to escape strings.  You can and should also always use the
    72 general ``%s`` specification instead of e.g. using ``%d`` for integers.
    73 Actually, to avoid mistakes and make it easier to insert parameters at more
    74 than one location, you can and should use named specifications, like this::
    75 
    76     >>> params = dict(greeting='Hello', name='HAL')
    77     >>> sql = """SELECT %(greeting)s || ', ' || %(name)s
    78     ...    || '. Do you read me, ' || %(name)s || '?'"""
    79     >>> cur.execute(sql, params).fetchone()[0]
    80     'Hello, HAL. Do you read me, HAL?'
    81 
    82 PyGreSQL does not only adapt the basic types like ``int``, ``float``,
    83 ``bool`` and ``str``, but also tries to make sense of Python lists and tuples.
     68If you like the ``%`` format specifications of Python better than the
     69placeholders used by PostgreSQL, there is still a way to use them, via the
     70:meth:`DB.query_formatted` method::
     71
     72    >>> sql = "SELECT * FROM user_table WHERE login = %s AND passwd = %s"
     73    >>> db.query_formatted(sql, (login, passwd)).getresult()[0]
     74
     75Note that we need to pass the parameters not as positional arguments here,
     76but as a single tuple.  Also note again that we did not use the ``%``
     77operator of Python to format the SQL string; we just used the ``%s`` format
     78specifications of Python and let PyGreSQL take care of the formatting.
     79Even better, you can also pass the parameters as a dictionary
     80to the :meth:`DB.query_formatted` method::
     81
     82    >>> sql = """SELECT * FROM user_table
     83    ...     WHERE login = %(login)s AND passwd = %(passwd)s"""
     84    >>> parameters = dict(login=login, passwd=passwd)
     85    >>> db.query_formatted(sql, parameters).getresult()[0]
     86
     87Here is another example::
     88
     89    >>> sql = "SELECT 'Hello, ' || %s || '!'"
     90    >>> db.query_formatted(sql, (login,)).getresult()[0]
     91
     92You would think that the following even simpler example should work, too::
     93
     94    >>> sql = "SELECT %s"
     95    >>> db.query_formatted(sql, (login,)).getresult()[0]
     96    ProgrammingError: Could not determine data type of parameter $1
     97
     98The issue here is that :meth:`DB.query_formatted` by default still uses
     99PostgreSQL parameters, transforming the Python-style ``%s`` placeholder
     100into a ``$1`` placeholder, and sending the login name separately from
     101the query.  In the query we looked at before, the concatenation with other
     102strings made it clear that it should be interpreted as a string.  This simple
     103query, however, does not give PostgreSQL a clue what data type the ``$1``
     104placeholder stands for.
     105
     106This is different when you are embedding the login name directly into the
     107query instead of passing it as a parameter to PostgreSQL.  You can achieve this
     108by setting the *inline* parameter of :meth:`DB.query_formatted`, like so::
     109
     110    >>> sql = "SELECT %s"
     111    >>> db.query_formatted(sql, (login,), inline=True).getresult()[0]
     112
     113Another way of making this query work while still sending the parameters
     114separately is to simply cast the parameter values::
     115
     116    >>> sql = "SELECT %s::text"
     117    >>> db.query_formatted(sql, (login,), inline=False).getresult()[0]
     118
     119In real-world examples you will rarely have to cast your parameters like that,
     120since in an INSERT statement or a WHERE clause comparing the parameter to a
     121table column the data type will be clear from the context.
     122
     123When binding the parameters to a query, PyGreSQL not only adapts the basic
     124types like ``int``, ``float``, ``bool`` and ``str``, but also tries to make
     125sense of Python lists and tuples.
    84126
    85127Lists are adapted as PostgreSQL arrays::
    86128
    87129   >>> params = dict(array=[[1, 2],[3, 4]])
    88    >>> cur.execute("SELECT %(array)s", params).fetchone()[0]
     130   >>> db.query_formatted("SELECT %(array)s::int[]", params).getresult()[0][0]
    89131   [[1, 2], [3, 4]]
    90132
    91 Note that the query gives the value back as Python lists again.  This
     133Note again that we only need to cast the array parameter or use inline
     134parameters because this simple query does not provide enough context.
     135Also note that the query gives the value back as Python lists again.  This
    92136is achieved by the typecasting mechanism explained in the next section.
    93 The query that was actually executed was this:
    94 
    95 .. code-block:: sql
    96 
    97     SELECT ARRAY[[1,2],[3,4]]
    98 
    99 Again, if we had inserted the list using the ``%`` operator without adaptation,
    100 the ``ARRAY`` keyword would have been missing in the query.
    101 
    102 Tuples are adapted as PostgreSQL composite types::
    103 
    104     >>> params = dict(record=('Bond', 'James'))
    105     >>> cur.execute("SELECT %(record)s", params).fetchone()[0]
    106     ('Bond', 'James')
    107 
    108 You can also use this feature with the ``IN`` syntax of SQL::
    109 
    110     >>> params = dict(what='needle', where=('needle', 'haystack'))
    111     >>> cur.execute("SELECT %(what)s IN %(where)s", params).fetchone()[0]
    112     True
    113 
    114 Sometimes a Python type can be ambiguous. For instance, you might want
    115 to insert a Python list not into an array column, but into a JSON column.
    116 Or you want to interpret a string as a date and insert it into a DATE column.
    117 In this case you can give PyGreSQL a hint by using :ref:`type_constructors`::
    118 
    119     >>> cur.execute("CREATE TABLE json_data (data json, created date)")
    120     >>> params = dict(
    121     ...     data=pgdb.Json([1, 2, 3]), created=pgdb.Date(2016, 1, 29))
    122     >>> sql = ("INSERT INTO json_data VALUES (%(data)s, %(created)s)")
    123     >>> cur.execute(sql, params)
    124     >>> cur.execute("SELECT * FROM json_data").fetchone()
    125     Row(data=[1, 2, 3], created='2016-01-29')
    126 
    127 Let's think of another example where we create a table with a composite
    128 type in PostgreSQL:
     137
     138Tuples are adapted as PostgreSQL composite types.  If you use inline parameters,
     139they can also be used with the ``IN`` syntax.
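The following is a rough, plain Python illustration of that idea (the helper names are made up here; this is not PyGreSQL's actual adaptation code): a tuple rendered inline becomes a parenthesized value list, which is exactly the form the SQL ``IN`` operator expects.

```python
def quote_str(value):
    # Illustration only: double single quotes, per SQL escaping rules
    if isinstance(value, str):
        return "'" + value.replace("'", "''") + "'"
    return str(value)

def render_tuple(values):
    # A Python tuple rendered inline as a parenthesized SQL value list
    return "(" + ", ".join(quote_str(v) for v in values) + ")"

sql = "SELECT %s IN %s" % (quote_str('needle'), render_tuple(('needle', 'haystack')))
print(sql)  # SELECT 'needle' IN ('needle', 'haystack')
```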
     140
     141Let's think of a more real-world example again where we create a table with a
     142composite type in PostgreSQL:
    129143
    130144.. code-block:: sql
     
    152166inserted into the database and then read back as follows::
    153167
    154    >>> cur.execute("INSERT INTO on_hand VALUES (%(item)s, %(count)s)",
     168   >>> db.query_formatted("INSERT INTO on_hand VALUES (%(item)s, %(count)s)",
    155169   ...     dict(item=inventory_item('fuzzy dice', 42, 1.99), count=1000))
    156    >>> cur.execute("SELECT * FROM on_hand").fetchone()
     170   >>> db.query("SELECT * FROM on_hand").getresult()[0][0]
    157171   Row(item=inventory_item(name='fuzzy dice', supplier_id=42,
    158172           price=Decimal('1.99')), count=1000)
     173
     174The :meth:`DB.insert` method provides a simpler way to achieve the same::
     175
     176    >>> row = dict(item=inventory_item('fuzzy dice', 42, 1.99), count=1000)
     177    >>> db.insert('on_hand', row)
     178    {'count': 1000,  'item': inventory_item(name='fuzzy dice',
     179            supplier_id=42, price=Decimal('1.99'))}
    159180
    160181However, we may not want to use named tuples, but custom Python classes
     
    173194
    174195But when we try to insert an instance of this class in the same way, we
    175 will get an error::
    176 
    177    >>> cur.execute("INSERT INTO on_hand VALUES (%(item)s, %(count)s)",
    178    ...     dict(item=InventoryItem('fuzzy dice', 42, 1.99), count=1000))
    179    InterfaceError: Do not know how to adapt type <class 'InventoryItem'>
    180 
    181 While PyGreSQL knows how to adapt tuples, it does not know what to make out
    182 of our custom class.  To simply convert the object to a string using the
    183 ``str`` function is not a solution, since this yields a human readable string
    184 that is not useful for PostgreSQL.  However, it is possible to make such
    185 custom classes adapt themselves to PostgreSQL by adding a "magic" method
    186 with the name ``__pg_repr__``, like this::
    187 
    188   >>> class InventoryItem:
     196will get an error.  This is because PyGreSQL tries to pass the string
     197representation of the object as a parameter to PostgreSQL, but this is just a
     198human-readable string and not useful for PostgreSQL to build a composite type.
     199However, it is possible to make such custom classes adapt themselves to
     200PostgreSQL by adding a "magic" method with the name ``__pg_str__``, like so::
     201
     202    >>> class InventoryItem:
    189203    ...
    190204    ...     ...
     
    194208    ...             self.name, self.supplier_id, self.price)
    195209    ...
    196     ...     def __pg_repr__(self):
     210    ...     def __pg_str__(self, typ):
    197211    ...         return (self.name, self.supplier_id, self.price)
    198212
    199213Now you can insert class instances the same way as you insert named tuples.
    200 
    201 Note that PyGreSQL adapts the result of ``__pg_repr__`` again if it is a
    202 tuple or a list.  Otherwise, it must be a properly escaped string.
     214You can even make these objects adapt to different types in different ways::
     215
     216    >>> class InventoryItem:
     217    ...
     218    ...     ...
     219    ...
     220    ...     def __pg_str__(self, typ):
     221    ...         if typ == 'text':
     222    ...             return str(self)
     223    ...             return (self.name, self.supplier_id, self.price)
     224    ...
     225    >>> db.query("ALTER TABLE on_hand ADD COLUMN remark varchar")
     226    >>> item = InventoryItem('fuzzy dice', 42, 1.99)
     227    >>> row = dict(item=item, remark=item, count=1000)
     228    >>> db.insert('on_hand', row)
     229    {'count': 1000, 'item': inventory_item(name='fuzzy dice',
     230        supplier_id=42, price=Decimal('1.99')),
     231        'remark': 'fuzzy dice (from 42, at $1.99)'}
     232
     233There is also another "magic" method ``__pg_repr__`` which does not take the
     234*typ* parameter.  That method is used instead of ``__pg_str__`` when passing
     235parameters inline.  You must be more careful when using ``__pg_repr__``,
     236because it must return a properly escaped string that can be put literally
     237inside the SQL.  The only exception is when you return a tuple or list,
     238because these will be adapted and properly escaped by PyGreSQL again.
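As a minimal sketch (the class here is hypothetical, not from the PyGreSQL sources), such a method could quote and escape a string attribute so that the result can be embedded verbatim in the SQL command:

```python
class Label:
    """Hypothetical class that adapts itself for inline parameters."""

    def __init__(self, text):
        self.text = text

    def __pg_repr__(self):
        # Must return a properly escaped SQL literal, because the result
        # is put literally into the SQL command
        return "'" + self.text.replace("'", "''") + "'"

print(Label("D'Arcy").__pg_repr__())  # 'D''Arcy'
```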
    203239
    204240Typecasting to Python
     
    206242
    207243As you noticed, PyGreSQL automatically converted the PostgreSQL data to
    208 suitable Python objects when returning values via one of the "fetch" methods
    209 of a cursor.  This is done by the use of built-in typecast functions.
    210 
    211 If you want to use different typecast functions or add your own  if no
     244suitable Python objects when returning values via the :meth:`DB.get()`,
     245:meth:`Query.getresult()` and similar methods.  This is done by the use
     246of built-in typecast functions.
     247
     248If you want to use different typecast functions or add your own if no
    212249built-in typecast function is available, then this is possible using
    213 the :func:`set_typecast` function.  With the :func:`get_typecast` method
    214 you can check which function is currently set, and :func:`reset_typecast`
    215 allows you to reset the typecast function to its default.  If no typecast
    216 function is set, then PyGreSQL will return the raw strings from the database.
     250the :func:`set_typecast` function.  With the :func:`get_typecast` function
     251you can check which function is currently set.  If no typecast function
     252is set, then PyGreSQL will return the raw strings from the database.
    217253
    218254For instance, you will find that PyGreSQL uses the normal ``int`` function
    219255to cast PostgreSQL ``int4`` type values to Python::
    220256
    221     >>> pgdb.get_typecast('int4')
     257    >>> pg.get_typecast('int4')
    222258    int
    223259
    224 You can change this to return float values instead::
    225 
    226     >>> pgdb.set_typecast('int4', float)
    227     >>> con = pgdb.connect(...)
    228     >>> cur = con.cursor()
    229     >>> cur.execute('select 42::int4').fetchone()[0]
    230     42.0
    231 
    232 Note that the connections cache typecast functions, so you may need to
    233 reopen the database connection, or reset the cache of the connection to
    234 make this effective, using the following command::
    235 
    236    >>> con.type_cache.reset_typecast()
    237 
    238 The :class:`TypeCache` of the connection can also be used to change typecast
    239 functions locally for one database connection only.
    240 
    241 As a more useful example, we can create a typecast function that casts
    242 items of the composite type used as example in the previous section
    243 to instances of the corresponding Python class::
    244 
    245     >>> con.type_cache.reset_typecast()
    246     >>> cast_tuple = con.type_cache.get_typecast('inventory_item')
     260In the classic PyGreSQL module, the typecasting for these basic types is
     261always done internally by the C extension module for performance reasons.
     262We can set a different typecast function for ``int4``, but it will not
     263become effective; the C module continues to use its internal typecasting.
     264
     265However, we can add new typecast functions for the database types that are
     266not supported by the C module.  For example, we can create a typecast function
     267that casts items of the composite PostgreSQL type used as an example in the
     268previous section to instances of the corresponding Python class.
     269
     270To do this, we first get the default typecast function that PyGreSQL has
     271created for the current :class:`DB` connection.  This default function casts
     272composite types to named tuples, as we saw in the previous section.
     273We can grab it from the :attr:`DB.dbtypes` object as follows::
     274
     275    >>> cast_tuple = db.dbtypes.get_typecast('inventory_item')
     276
     277Now we can create a new typecast function that converts the tuple to
     278an instance of our custom class::
     279
    247280    >>> cast_item = lambda value: InventoryItem(*cast_tuple(value))
    248     >>> con.type_cache.set_typecast('inventory_item', cast_item)
    249     >>> str(cur.execute("SELECT * FROM on_hand").fetchone()[0])
     281
     282Finally, we set this typecast function, either globally with
     283:func:`set_typecast`, or locally for the current connection like this::
     284
     285    >>> db.dbtypes.set_typecast('inventory_item', cast_item)
     286
     287Now we can get instances of our custom class directly from the database::
     288
     289    >>> item = db.query("SELECT * FROM on_hand").getresult()[0][0]
     290    >>> str(item)
    250291    'fuzzy dice (from 42, at $1.99)'
    251292
    252 As you saw in the last section, PyGreSQL also has a typecast function
    253 for JSON, which is the default JSON decoder from the standard library.
    254 Let's assume we want to use a slight variation of that decoder in which
    255 every integer in JSON is converted to a float in Python. This can be
    256 accomplished as follows::
    257 
    258     >>> from json import loads
    259     >>> cast_json = lambda v: loads(v, parse_int=float)
    260     >>> pgdb.set_typecast('json', cast_json)
    261     >>> cur.execute("SELECT data FROM json_data").fetchone()[0]
    262     [1.0, 2.0, 3.0]
    263 
    264 Note again that you may need to ``type_cache.reset_typecast()`` to make
    265 this effective.  Also note that the two types ``json`` and ``jsonb`` have
    266 their own typecast functions, so if you use ``jsonb`` instead of ``json``,
    267 you need to use this type name when setting the typecast function::
    268 
    269     >>> pgdb.set_typecast('jsonb', cast_json)
     293Note that some of the typecast functions used by the C module are configurable
     294with separate module-level functions, such as :func:`set_decimal`,
     295:func:`set_bool` or :func:`set_jsondecode`.  You need to use these instead of
     296:func:`set_typecast` if you want to change the behavior of the C module.
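For instance, the JSON decoding used by the C module can be changed with ``set_jsondecode``.  A minimal sketch of a custom decoder that parses every JSON integer as a Python float (mirroring the pgdb example touched in this changeset):

```python
from json import loads

# Custom decoder: parse JSON integers as floats; you would register it
# with pg.set_jsondecode(cast_json) to change the C module's behavior
cast_json = lambda v: loads(v, parse_int=float)

print(cast_json('[1, 2, 3]'))  # [1.0, 2.0, 3.0]
```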
     297
     298Also note that after changing global typecast functions with
     299:meth:`set_typecast`, you may need to run ``db.dbtypes.reset_typecast()``
     300to make these changes effective on connections that were already open.
  • trunk/docs/contents/pg/db_wrapper.rst

    r798 r799  
    452452    rows = db.query("update employees set phone=$2 where name=$1",
    453453         name, phone).getresult()[0][0]
     454
     455query_formatted -- execute a formatted SQL command string
     456---------------------------------------------------------
     457
     458.. method:: DB.query_formatted(command, parameters, [types], [inline])
     459
     460    Execute a formatted SQL command string
     461
     462    :param str command: SQL command
     463    :param parameters: the values of the parameters for the SQL command
     464    :type parameters: tuple, list or dict
     465    :param types: optionally, the types of the parameters
     466    :type types: tuple, list or dict
     467    :param bool inline: whether the parameters should be passed in the SQL
     468    :rtype: :class:`Query`, None
     469    :raises TypeError: bad argument type, or too many arguments
     470    :raises TypeError: invalid connection
     471    :raises ValueError: empty SQL query or lost connection
     472    :raises pg.ProgrammingError: error in query
     473    :raises pg.InternalError: error during query processing
     474
     475Similar to :meth:`DB.query`, but using Python format placeholders of the form
     476``%s`` or ``%(names)s`` instead of PostgreSQL placeholders of the form ``$1``.
     477The parameters must be passed as a tuple, list or dict.  You can also pass a
     478corresponding tuple, list or dict of database types in order to format the
     479parameters properly in case there is ambiguity.
     480
     481If you set *inline* to True, the parameters will be sent to the database
     482embedded in the SQL command, otherwise they will be sent separately.
     483
     484Example::
     485
     486    name = input("Name? ")
     487    phone = input("Phone? ")
     488    rows = db.query_formatted(
     489        "update employees set phone=%s where name=%s",
     490        (phone, name)).getresult()[0][0]
     491    # or
     492    rows = db.query_formatted(
     493        "update employees set phone=%(phone)s where name=%(name)s",
     494        dict(name=name, phone=phone)).getresult()[0][0]
    454495
    455496clear -- clear row values in memory
     
    780821    The name of the database that the connection is using
    781822
    782 
    783823.. attribute:: DB.dbtypes
    784824
     
    790830
    791831.. versionadded:: 5.0
     832
     833.. attribute:: DB.adapter
     834
     835    A class with some helper functions for adapting parameters
     836
     837This can be used for building queries with parameters.  You normally will
     838not need this, as you can use the :meth:`DB.query_formatted` method.
     839
     840.. versionadded:: 5.0
  • trunk/docs/contents/pg/index.rst

    r798 r799  
    1717    notification
    1818    db_types
     19    adaptation
  • trunk/docs/contents/pg/module.rst

    r798 r799  
    500500supported by the C extension module.
    501501
     502Type helpers
     503------------
     504
     505The module provides the following type helper functions.  You can wrap
     506parameters with these functions when passing them to :meth:`DB.query_formatted`
     507in order to give PyGreSQL a hint about the type of the parameters.
     508
     509.. function:: Bytea(bytes)
     510
     511    A wrapper for holding a bytea value
     512
     513.. function:: Json(obj)
     514
     515    A wrapper for holding an object serializable to JSON
     516
     517.. function:: Literal(sql)
     518
     519    A wrapper for holding a literal SQL string
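To see why these hints matter, compare what a plain Python list and a ``Json``-wrapped list conceptually turn into (a sketch only; ``as_array_literal`` is a made-up helper, not part of PyGreSQL): a bare list is adapted as a PostgreSQL array, while ``Json`` serializes the value with :func:`json.dumps`.

```python
from json import dumps

def as_array_literal(values):
    # Made-up helper: roughly how a plain Python list of ints would be
    # rendered as a PostgreSQL array value
    return "ARRAY[%s]" % ",".join(str(v) for v in values)

value = [1, 2, 3]
print(as_array_literal(value))  # ARRAY[1,2,3]
print(dumps(value))             # [1, 2, 3]  (what a Json wrapper would send)
```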
    502520
    503521Module constants
    504522----------------
     523
    505524Some constants are defined in the module dictionary.
    506525They are intended to be used as parameters for method calls.
  • trunk/docs/contents/pgdb/adaptation.rst

    r797 r799  
    209209of a cursor.  This is done by the use of built-in typecast functions.
    210210
    211 If you want to use different typecast functions or add your own  if no
     211If you want to use different typecast functions or add your own if no
    212212built-in typecast function is available, then this is possible using
    213 the :func:`set_typecast` function.  With the :func:`get_typecast` method
     213the :func:`set_typecast` function.  With the :func:`get_typecast` function
    214214you can check which function is currently set, and :func:`reset_typecast`
    215215allows you to reset the typecast function to its default.  If no typecast
     
    230230    42.0
    231231
    232 Note that the connections cache typecast functions, so you may need to
     232Note that the connections cache the typecast functions, so you may need to
    233233reopen the database connection, or reset the cache of the connection to
    234234make this effective, using the following command::
     
    262262    [1.0, 2.0, 3.0]
    263263
    264 Note again that you may need to ``type_cache.reset_typecast()`` to make
    265 this effective.  Also note that the two types ``json`` and ``jsonb`` have
    266 their own typecast functions, so if you use ``jsonb`` instead of ``json``,
    267 you need to use this type name when setting the typecast function::
     264Note again that you may need to run ``con.type_cache.reset_typecast()`` to
     265make this effective.  Also note that the two types ``json`` and ``jsonb`` have
     266their own typecast functions, so if you use ``jsonb`` instead of ``json``, you
     267need to use this type name when setting the typecast function::
    268268
    269269    >>> pgdb.set_typecast('jsonb', cast_json)
  • trunk/docs/contents/pgdb/types.rst

    r797 r799  
    5555specific data types:
    5656
     57.. function:: Bytea(bytes)
     58
     59    Construct an object capable of holding a bytea value
     60
    5761.. function:: Json(obj, [encode])
    5862
    59     Construct a wrapper for holding an object serializable to JSON.
     63    Construct a wrapper for holding an object serializable to JSON
    6064
    6165    You can pass an optional serialization function as a parameter.
    6266    By default, PyGreSQL uses :func:`json.dumps` to serialize it.
     67
     68.. function:: Literal(sql)
     69
     70    Construct a wrapper for holding a literal SQL string
    6371
    6472Example for using a type constructor::