rdflib.plugins.sparql package

Subpackages

Submodules

rdflib.plugins.sparql.aggregates module

rdflib.plugins.sparql.algebra module

rdflib.plugins.sparql.datatypes module

Utility functions for supporting the XML Schema Datatypes hierarchy

rdflib.plugins.sparql.datatypes.type_promotion(t1, t2)[source]

rdflib.plugins.sparql.evaluate module

rdflib.plugins.sparql.evalutils module

rdflib.plugins.sparql.operators module

This contains evaluation functions for expressions.

They get bound as instance methods to the CompValue objects from parserutils using setEvalFn.

rdflib.plugins.sparql.operators.AdditiveExpression(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_ABS(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-abs

rdflib.plugins.sparql.operators.Builtin_BNODE(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-bnode

rdflib.plugins.sparql.operators.Builtin_BOUND(e, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-bound

rdflib.plugins.sparql.operators.Builtin_CEIL(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-ceil

rdflib.plugins.sparql.operators.Builtin_COALESCE(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-coalesce

rdflib.plugins.sparql.operators.Builtin_CONCAT(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-concat

rdflib.plugins.sparql.operators.Builtin_CONTAINS(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-contains

rdflib.plugins.sparql.operators.Builtin_DATATYPE(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_DAY(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_ENCODE_FOR_URI(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_EXISTS(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_FLOOR(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-floor

rdflib.plugins.sparql.operators.Builtin_HOURS(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_IF(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-if

rdflib.plugins.sparql.operators.Builtin_IRI(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-iri

rdflib.plugins.sparql.operators.Builtin_LANG(e, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-lang

Returns the language tag of ltrl, if it has one. It returns “” if ltrl has no language tag. Note that the RDF data model does not include literals with an empty language tag.

rdflib.plugins.sparql.operators.Builtin_LANGMATCHES(e, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-langMatches

rdflib.plugins.sparql.operators.Builtin_LCASE(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_MD5(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_MINUTES(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_MONTH(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_NOW(e, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-now

rdflib.plugins.sparql.operators.Builtin_RAND(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#idp2133952

rdflib.plugins.sparql.operators.Builtin_REGEX(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-regex

Invokes the XPath fn:matches function to match text against a regular expression pattern. The regular expression language is defined in XQuery 1.0 and XPath 2.0 Functions and Operators, section 7.6.1 Regular Expression Syntax.
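
For example, a query like the following exercises this builtin through a FILTER (a minimal sketch assuming rdflib is installed; the example.org graph and property are made up):

```python
from rdflib import Graph, Literal, URIRef

# Illustrative data; the example.org URIs are not part of rdflib.
g = Graph()
g.add((URIRef("http://example.org/a"), URIRef("http://example.org/name"), Literal("Alice")))
g.add((URIRef("http://example.org/b"), URIRef("http://example.org/name"), Literal("Bob")))

q = """
SELECT ?s WHERE {
  ?s <http://example.org/name> ?name .
  FILTER regex(?name, "^ali", "i")
}
"""
for row in g.query(q):
    print(row.s)  # only http://example.org/a matches
```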

rdflib.plugins.sparql.operators.Builtin_REPLACE(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-replace

rdflib.plugins.sparql.operators.Builtin_ROUND(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-round

rdflib.plugins.sparql.operators.Builtin_SECONDS(e, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-seconds

rdflib.plugins.sparql.operators.Builtin_SHA1(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_SHA256(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_SHA384(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_SHA512(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_STR(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_STRAFTER(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-strafter

rdflib.plugins.sparql.operators.Builtin_STRBEFORE(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-strbefore

rdflib.plugins.sparql.operators.Builtin_STRDT(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-strdt

rdflib.plugins.sparql.operators.Builtin_STRENDS(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-strends

rdflib.plugins.sparql.operators.Builtin_STRLANG(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-strlang

rdflib.plugins.sparql.operators.Builtin_STRLEN(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_STRSTARTS(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-strstarts

rdflib.plugins.sparql.operators.Builtin_STRUUID(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-struuid

rdflib.plugins.sparql.operators.Builtin_SUBSTR(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-substr

rdflib.plugins.sparql.operators.Builtin_TIMEZONE(e, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-timezone

Returns:

the timezone part of arg as an xsd:dayTimeDuration.

Raises:

an error if there is no timezone.

rdflib.plugins.sparql.operators.Builtin_TZ(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_UCASE(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_UUID(expr, ctx)[source]

http://www.w3.org/TR/sparql11-query/#func-uuid

rdflib.plugins.sparql.operators.Builtin_YEAR(e, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_isBLANK(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_isIRI(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_isLITERAL(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_isNUMERIC(expr, ctx)[source]
rdflib.plugins.sparql.operators.Builtin_sameTerm(e, ctx)[source]
rdflib.plugins.sparql.operators.ConditionalAndExpression(e, ctx)[source]
rdflib.plugins.sparql.operators.ConditionalOrExpression(e, ctx)[source]
rdflib.plugins.sparql.operators.EBV(rt)[source]
  • If the argument is a typed literal with a datatype of xsd:boolean, the EBV is the value of that argument.

  • If the argument is a plain literal or a typed literal with a datatype of xsd:string, the EBV is false if the operand value has zero length; otherwise the EBV is true.

  • If the argument is a numeric type or a typed literal with a datatype derived from a numeric type, the EBV is false if the operand value is NaN or is numerically equal to zero; otherwise the EBV is true.

  • All other arguments, including unbound arguments, produce a type error.
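
A rough illustration of these rules, calling EBV directly on rdflib terms (the specific literals are only examples):

```python
from rdflib import Literal, Variable
from rdflib.plugins.sparql.operators import EBV
from rdflib.plugins.sparql.sparql import SPARQLError

EBV(Literal(True))      # True  (xsd:boolean literal)
EBV(Literal(""))        # False (zero-length string)
EBV(Literal("hello"))   # True
EBV(Literal(0))         # False (numerically zero)
EBV(Literal(42))        # True

try:
    EBV(Variable("x"))  # not a literal: produces a type error
except SPARQLError:
    pass
```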

rdflib.plugins.sparql.operators.Function(e, ctx)[source]

Custom functions and casts

rdflib.plugins.sparql.operators.MultiplicativeExpression(e, ctx)[source]
rdflib.plugins.sparql.operators.RelationalExpression(e, ctx)[source]
rdflib.plugins.sparql.operators.UnaryMinus(expr, ctx)[source]
rdflib.plugins.sparql.operators.UnaryNot(expr, ctx)[source]
rdflib.plugins.sparql.operators.UnaryPlus(expr, ctx)[source]
rdflib.plugins.sparql.operators.and_(*args)[source]
rdflib.plugins.sparql.operators.custom_function(uri, override=False, raw=False)[source]

Decorator version of register_custom_function().

rdflib.plugins.sparql.operators.datetime(e)[source]
rdflib.plugins.sparql.operators.default_cast(e, ctx)[source]
rdflib.plugins.sparql.operators.literal(s)[source]
rdflib.plugins.sparql.operators.not_(arg)[source]
rdflib.plugins.sparql.operators.numeric(expr)[source]

Return a number from a literal (see http://www.w3.org/TR/xpath20/#promotion), or raise a TypeError.

rdflib.plugins.sparql.operators.register_custom_function(uri, func, override=False, raw=False)[source]

Register a custom SPARQL function.

By default, the function will be passed the RDF terms in the argument list. If raw is True, the function will be passed an Expression and a Context.

The function must return an RDF term, or raise a SPARQLError.
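
A minimal registration sketch; the function IRI and the reversing logic are illustrative, not part of rdflib:

```python
from rdflib import Graph, Literal, URIRef
from rdflib.plugins.sparql.operators import (
    register_custom_function,
    unregister_custom_function,
)

EX_REVERSE = URIRef("http://example.org/reverse")  # made-up function IRI

def rev(lit):
    # Called with the RDF terms from the argument list; must return an RDF term.
    return Literal(str(lit)[::-1])

register_custom_function(EX_REVERSE, rev)
# Equivalently, decorate rev with @custom_function(EX_REVERSE).

g = Graph()
for row in g.query(
    'SELECT ?r WHERE { BIND(<http://example.org/reverse>("abc") AS ?r) }'
):
    print(row.r)  # -> "cba"

unregister_custom_function(EX_REVERSE, rev)
```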

rdflib.plugins.sparql.operators.simplify(expr)[source]
rdflib.plugins.sparql.operators.string(s)[source]

Make sure the passed value is a string literal, i.e. a plain literal, an xsd:string literal, or a language-tagged literal.

rdflib.plugins.sparql.operators.unregister_custom_function(uri, func)[source]

rdflib.plugins.sparql.parser module

rdflib.plugins.sparql.parserutils module

class rdflib.plugins.sparql.parserutils.Comp(name, expr)[source]

Bases: TokenConverter

A pyparsing token for grouping things together with a label. Any sub-tokens that are not Params will be ignored.

Returns CompValue or Expr objects, depending on whether evalFn is set.

__abstractmethods__ = frozenset({})
__init__(name, expr)[source]
__module__ = 'rdflib.plugins.sparql.parserutils'
postParse(instring, loc, tokenList)[source]
setEvalFn(evalfn)[source]
class rdflib.plugins.sparql.parserutils.CompValue(name, **values)[source]

Bases: OrderedDict

The result of parsing a Comp. Any included Params are available as dict keys or as attributes.

__getattr__(a)[source]
__getitem__(a)[source]

Return self[key].

__init__(name, **values)[source]
__module__ = 'rdflib.plugins.sparql.parserutils'
__repr__()[source]

Return repr(self).

__str__()[source]

Return str(self).

clone()[source]
get(a, variables=False, errors=False)[source]

Return the value for key if key is in the dictionary, else default.

class rdflib.plugins.sparql.parserutils.Expr(name, evalfn=None, **values)[source]

Bases: CompValue

A CompValue that is evaluatable

__annotations__ = {}
__init__(name, evalfn=None, **values)[source]
__module__ = 'rdflib.plugins.sparql.parserutils'
eval(ctx={})[source]
class rdflib.plugins.sparql.parserutils.Param(name, expr, isList=False)[source]

Bases: TokenConverter

A pyparsing token for labelling a part of the parse tree. If isList is true, repeated occurrences of a ParamList have their values merged into a list.

__abstractmethods__ = frozenset({})
__init__(name, expr, isList=False)[source]
__module__ = 'rdflib.plugins.sparql.parserutils'
postParse2(tokenList)[source]
class rdflib.plugins.sparql.parserutils.ParamList(name, expr)[source]

Bases: Param

A shortcut for a Param with isList=True

__abstractmethods__ = frozenset({})
__init__(name, expr)[source]
__module__ = 'rdflib.plugins.sparql.parserutils'
class rdflib.plugins.sparql.parserutils.ParamValue(name, tokenList, isList)[source]

Bases: object

The result of parsing a Param. This just keeps the name/value; all the cleverness is in the CompValue.

__init__(name, tokenList, isList)[source]
__module__ = 'rdflib.plugins.sparql.parserutils'
__str__()[source]

Return str(self).

__weakref__

list of weak references to the object

class rdflib.plugins.sparql.parserutils.plist(iterable=(), /)[source]

Bases: list

This is just a list, but we want our own type so we can check for it.

__module__ = 'rdflib.plugins.sparql.parserutils'
__weakref__

list of weak references to the object

rdflib.plugins.sparql.parserutils.prettify_parsetree(t, indent='', depth=0)[source]
rdflib.plugins.sparql.parserutils.value(ctx, val, variables=False, errors=False)[source]

Utility function for evaluating something.

Variables will be looked up in the context. Normally, an unbound variable is an error; set variables=True to return unbound variables instead.

Normally, an error value raises the error; set errors=True to return the error instead.
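
A rough sketch of how these parserutils pieces fit together, using parseQuery from rdflib.plugins.sparql.parser (the exact shape of the tree depends on the query):

```python
from rdflib.plugins.sparql.parser import parseQuery
from rdflib.plugins.sparql.parserutils import prettify_parsetree

parsetree = parseQuery("SELECT * WHERE { ?s ?p ?o }")
print(prettify_parsetree(parsetree))  # human-readable dump of the parse tree

query = parsetree[1]   # the query part of the parse result, a CompValue
print(query.name)      # e.g. 'SelectQuery'
print(query.where)     # Params are reachable as attributes or as dict keys
```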

rdflib.plugins.sparql.processor module

rdflib.plugins.sparql.sparql module

exception rdflib.plugins.sparql.sparql.AlreadyBound[source]

Bases: SPARQLError

Raised when trying to bind a variable that is already bound!

__init__()[source]
__module__ = 'rdflib.plugins.sparql.sparql'
class rdflib.plugins.sparql.sparql.Bindings(outer=None, d=[])[source]

Bases: MutableMapping

A single level of a stack of variable-value bindings. Each dict keeps a reference to the dict below it; any failed lookup is propagated back.

In Python 3.3+ this could be a collections.ChainMap.
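
A small sketch of the chained lookup described above (the variables and values are arbitrary):

```python
from rdflib import Literal, Variable
from rdflib.plugins.sparql.sparql import Bindings

outer = Bindings()
outer[Variable("x")] = Literal("1")

inner = Bindings(outer)        # a new level stacked on top of `outer`
inner[Variable("y")] = Literal("2")

Variable("x") in inner         # True: the lookup falls through to the outer level
inner[Variable("x")]           # Literal("1")
len(inner)                     # 2 (both levels counted)
```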

__abstractmethods__ = frozenset({})
__contains__(key)[source]
__delitem__(key)[source]
__getitem__(key)[source]
__init__(outer=None, d=[])[source]
__iter__()[source]
__len__()[source]
__module__ = 'rdflib.plugins.sparql.sparql'
__repr__()[source]

Return repr(self).

__setitem__(key, value)[source]
__str__()[source]

Return str(self).

__weakref__

list of weak references to the object

class rdflib.plugins.sparql.sparql.FrozenBindings(ctx, *args, **kwargs)[source]

Bases: FrozenDict

__abstractmethods__ = frozenset({})
__annotations__ = {}
__getitem__(key)[source]
__init__(ctx, *args, **kwargs)[source]
__module__ = 'rdflib.plugins.sparql.sparql'
property bnodes
forget(before, _except=None)[source]

Return a frozen dict of only the bindings made in self since before.

merge(other)[source]
property now
project(vars)[source]
property prologue
remember(these)[source]

Return a frozen dict of only the bindings in these.

class rdflib.plugins.sparql.sparql.FrozenDict(*args, **kwargs)[source]

Bases: Mapping

An immutable hashable dict

Taken from http://stackoverflow.com/a/2704866/81121

__abstractmethods__ = frozenset({})
__annotations__ = {}
__getitem__(key)[source]
__hash__()[source]

Return hash(self).

__init__(*args, **kwargs)[source]
__iter__()[source]
__len__()[source]
__module__ = 'rdflib.plugins.sparql.sparql'
__repr__()[source]

Return repr(self).

__str__()[source]

Return str(self).

__weakref__

list of weak references to the object

compatible(other)[source]
disjointDomain(other)[source]
merge(other)[source]
project(vars)[source]
exception rdflib.plugins.sparql.sparql.NotBoundError(msg=None)[source]

Bases: SPARQLError

__annotations__ = {}
__init__(msg=None)[source]
__module__ = 'rdflib.plugins.sparql.sparql'
class rdflib.plugins.sparql.sparql.Prologue[source]

Bases: object

A class for holding prefix bindings and base URI information

__init__()[source]
__module__ = 'rdflib.plugins.sparql.sparql'
__weakref__

list of weak references to the object

absolutize(iri)[source]

Apply BASE / PREFIXes to URIs (and to datatypes in Literals)

TODO: Move resolving URIs to pre-processing

bind(prefix, uri)[source]
resolvePName(prefix, localname)[source]
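
A rough usage sketch; the ex prefix and URI are illustrative:

```python
from rdflib import URIRef
from rdflib.plugins.sparql.sparql import Prologue

p = Prologue()
p.bind("ex", URIRef("http://example.org/"))
p.resolvePName("ex", "thing")  # -> URIRef('http://example.org/thing')
```
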
class rdflib.plugins.sparql.sparql.Query(prologue, algebra)[source]

Bases: object

A parsed and translated query

__init__(prologue, algebra)[source]
__module__ = 'rdflib.plugins.sparql.sparql'
__weakref__

list of weak references to the object

class rdflib.plugins.sparql.sparql.QueryContext(graph=None, bindings=None, initBindings=None)[source]

Bases: object

Query context - passed along when evaluating the query

__getitem__(key)[source]
__init__(graph=None, bindings=None, initBindings=None)[source]
__module__ = 'rdflib.plugins.sparql.sparql'
__setitem__(key, value)[source]
__weakref__

list of weak references to the object

clean()[source]
clone(bindings=None)[source]
property dataset

current dataset

get(key, default=None)[source]
load(source, default=False, **kwargs)[source]
push()[source]
pushGraph(graph)[source]
solution(vars=None)[source]

Return a static copy of the current variable bindings as a dict

thaw(frozenbindings)[source]

Create a new read/write query context from the given solution

exception rdflib.plugins.sparql.sparql.SPARQLError(msg=None)[source]

Bases: Exception

__annotations__ = {}
__init__(msg=None)[source]
__module__ = 'rdflib.plugins.sparql.sparql'
__weakref__

list of weak references to the object

exception rdflib.plugins.sparql.sparql.SPARQLTypeError(msg)[source]

Bases: SPARQLError

__annotations__ = {}
__init__(msg)[source]
__module__ = 'rdflib.plugins.sparql.sparql'

rdflib.plugins.sparql.update module

Module contents