
issue_comments


3 rows where issue = 989986586 sorted by updated_at descending

id: 914441037
html_url: https://github.com/simonw/datasette/issues/1461#issuecomment-914441037
issue_url: https://api.github.com/repos/simonw/datasette/issues/1461
node_id: IC_kwDOBm6k_c42gUNN
user: simonw (9599)
created_at: 2021-09-07T16:13:59Z
updated_at: 2021-09-07T16:13:59Z
author_association: OWNER

body:

I don't think I'll adopt it for this project. For example, here:

```diff
 response = Response.redirect("/")
-response.set_cookie("ds_actor", datasette.sign({
-    "a": {
-        "id": "cleopaws"
-    }
-}, "actor"))
+response.set_cookie("ds_actor", datasette.sign({"a": {"id": "cleopaws"}}, "actor"))
```

I chose to use the multi-line version to help emphasize the structure - the single-line replacement loses that. I think I'll continue to make my own editorial choices about how the code examples are laid out.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Try blacken-docs (989986586)
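An aside, not from the issue thread: Black's "magic trailing comma" (Black 20.8b0 and later) is one way to pin a deliberately multi-line layout so that Black, and therefore blacken-docs, leaves it exploded rather than collapsing it. A minimal sketch of that behavior using Black's programmatic API, formatting the snippet from the comment above:

```python
# Not from the issue thread: a sketch of Black's "magic trailing comma"
# (Black >= 20.8b0). A trailing comma inside brackets tells Black to keep
# that construct split across lines instead of collapsing it to one line.
import black

collapsed_by_black = '''\
response.set_cookie("ds_actor", datasette.sign({
    "a": {
        "id": "cleopaws"
    }
}, "actor"))
'''

pinned_multiline = '''\
response.set_cookie(
    "ds_actor",
    datasette.sign({"a": {"id": "cleopaws"}}, "actor"),
)
'''

mode = black.Mode(line_length=88)
# The first snippet fits in 88 columns once joined, so Black collapses it to one line.
print(black.format_str(collapsed_by_black, mode=mode))
# The trailing comma after the last argument keeps the second snippet multi-line.
print(black.format_str(pinned_multiline, mode=mode))
```

Whether adding trailing commas everywhere is worth it is exactly the kind of editorial choice the comment above is reserving.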
id: 914440282
html_url: https://github.com/simonw/datasette/issues/1461#issuecomment-914440282
issue_url: https://api.github.com/repos/simonw/datasette/issues/1461
node_id: IC_kwDOBm6k_c42gUBa
user: simonw (9599)
created_at: 2021-09-07T16:12:57Z
updated_at: 2021-09-07T16:12:57Z
author_association: OWNER

body:

Here's the diff it produced from that first run:

```diff
diff --git a/docs/authentication.rst b/docs/authentication.rst
index 0d98cf8..8008023 100644
--- a/docs/authentication.rst
+++ b/docs/authentication.rst
@@ -381,11 +381,7 @@ Authentication plugins can set signed ``ds_actor`` cookies themselves like so:
 
 .. code-block:: python
 
     response = Response.redirect("/")
-    response.set_cookie("ds_actor", datasette.sign({
-        "a": {
-            "id": "cleopaws"
-        }
-    }, "actor"))
+    response.set_cookie("ds_actor", datasette.sign({"a": {"id": "cleopaws"}}, "actor"))
 
 Note that you need to pass "actor" as the namespace to :ref:datasette_sign.
 
@@ -412,12 +408,16 @@ To include an expiry, add a "e" key to the cookie value containing a `base62
 
     expires_at = int(time.time()) + (24 * 60 * 60)
 
     response = Response.redirect("/")
-    response.set_cookie("ds_actor", datasette.sign({
-        "a": {
-            "id": "cleopaws"
-        },
-        "e": baseconv.base62.encode(expires_at),
-    }, "actor"))
+    response.set_cookie(
+        "ds_actor",
+        datasette.sign(
+            {
+                "a": {"id": "cleopaws"},
+                "e": baseconv.base62.encode(expires_at),
+            },
+            "actor",
+        ),
+    )
 
 The resulting cookie will encode data that looks something like this:
 
diff --git a/docs/spatialite.rst b/docs/spatialite.rst
index d1b300b..556bad8 100644
--- a/docs/spatialite.rst
+++ b/docs/spatialite.rst
@@ -58,19 +58,22 @@ Here's a recipe for taking a table with existing latitude and longitude columns,
 
 .. code-block:: python
 
     import sqlite3
-    conn = sqlite3.connect('museums.db')
+
+    conn = sqlite3.connect("museums.db")
     # Lead the spatialite extension:
     conn.enable_load_extension(True)
-    conn.load_extension('/usr/local/lib/mod_spatialite.dylib')
+    conn.load_extension("/usr/local/lib/mod_spatialite.dylib")
     # Initialize spatial metadata for this database:
-    conn.execute('select InitSpatialMetadata(1)')
+    conn.execute("select InitSpatialMetadata(1)")
     # Add a geometry column called point_geom to our museums table:
     conn.execute("SELECT AddGeometryColumn('museums', 'point_geom', 4326, 'POINT', 2);")
     # Now update that geometry column with the lat/lon points
-    conn.execute('''
+    conn.execute(
+        """
     UPDATE museums SET
     point_geom = GeomFromText('POINT('||"longitude"||' '||"latitude"||')',4326);
-    ''')
+    """
+    )
     # Now add a spatial index to that column
     conn.execute('select CreateSpatialIndex("museums", "point_geom");')
     # If you don't commit your changes will not be persisted:
@@ -186,13 +189,14 @@ Here's Python code to create a SQLite database, enable SpatiaLite, create a plac
 
     import sqlite3
-    conn = sqlite3.connect('places.db')
+
+    conn = sqlite3.connect("places.db")
     # Enable SpatialLite extension
     conn.enable_load_extension(True)
-    conn.load_extension('/usr/local/lib/mod_spatialite.dylib')
+    conn.load_extension("/usr/local/lib/mod_spatialite.dylib")
     # Create the masic countries table
-    conn.execute('select InitSpatialMetadata(1)')
-    conn.execute('create table places (id integer primary key, name text);')
+    conn.execute("select InitSpatialMetadata(1)")
+    conn.execute("create table places (id integer primary key, name text);")
     # Add a MULTIPOLYGON Geometry column
     conn.execute("SELECT AddGeometryColumn('places', 'geom', 4326, 'MULTIPOLYGON', 2);")
     # Add a spatial index against the new column
@@ -201,13 +205,17 @@ Here's Python code to create a SQLite database, enable SpatiaLite, create a plac
     from shapely.geometry.multipolygon import MultiPolygon
     from shapely.geometry import shape
     import requests
-    geojson = requests.get('https://data.whosonfirst.org/404/227/475/404227475.geojson').json()
+
+    geojson = requests.get(
+        "https://data.whosonfirst.org/404/227/475/404227475.geojson"
+    ).json()
     # Convert to "Well Known Text" format
-    wkt = shape(geojson['geometry']).wkt
+    wkt = shape(geojson["geometry"]).wkt
     # Insert and commit the record
-    conn.execute("INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))", (
-        "Wales", wkt
-    ))
+    conn.execute(
+        "INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))",
+        ("Wales", wkt),
+    )
     conn.commit()
 
 Querying polygons using within()
 
diff --git a/docs/writing_plugins.rst b/docs/writing_plugins.rst
index bd60a4b..5af01f6 100644
--- a/docs/writing_plugins.rst
+++ b/docs/writing_plugins.rst
@@ -18,9 +18,10 @@ The quickest way to start writing a plugin is to create a my_plugin.py file
 
     from datasette import hookimpl
 
+
     @hookimpl
     def prepare_connection(conn):
-        conn.create_function('hello_world', 0, lambda: 'Hello world!')
+        conn.create_function("hello_world", 0, lambda: "Hello world!")
 
 If you save this in plugins/my_plugin.py you can then start Datasette like this::
 
@@ -60,22 +61,18 @@ The example consists of two files: a setup.py file that defines the plugin:
 
     from setuptools import setup
 
-    VERSION = '0.1'
+    VERSION = "0.1"
 
     setup(
-        name='datasette-plugin-demos',
-        description='Examples of plugins for Datasette',
-        author='Simon Willison',
-        url='https://github.com/simonw/datasette-plugin-demos',
-        license='Apache License, Version 2.0',
+        name="datasette-plugin-demos",
+        description="Examples of plugins for Datasette",
+        author="Simon Willison",
+        url="https://github.com/simonw/datasette-plugin-demos",
+        license="Apache License, Version 2.0",
         version=VERSION,
-        py_modules=['datasette_plugin_demos'],
-        entry_points={
-            'datasette': [
-                'plugin_demos = datasette_plugin_demos'
-            ]
-        },
-        install_requires=['datasette']
+        py_modules=["datasette_plugin_demos"],
+        entry_points={"datasette": ["plugin_demos = datasette_plugin_demos"]},
+        install_requires=["datasette"],
     )
 
 And a Python module file, datasette_plugin_demos.py, that implements the plugin:
 
@@ -88,12 +85,12 @@ And a Python module file, datasette_plugin_demos.py, that implements the plu
 
     @hookimpl
     def prepare_jinja2_environment(env):
-        env.filters['uppercase'] = lambda u: u.upper()
+        env.filters["uppercase"] = lambda u: u.upper()
 
     @hookimpl
     def prepare_connection(conn):
-        conn.create_function('random_integer', 2, random.randint)
+        conn.create_function("random_integer", 2, random.randint)
 
 Having built a plugin in this way you can turn it into an installable package using the following command::
 
@@ -123,11 +120,13 @@ To bundle the static assets for a plugin in the package that you publish to PyPI
 
 .. code-block:: python
 
-    package_data={
-        'datasette_plugin_name': [
-            'static/plugin.js',
-        ],
-    },
+    package_data = (
+        {
+            "datasette_plugin_name": [
+                "static/plugin.js",
+            ],
+        },
+    )
 
 Where datasette_plugin_name is the name of the plugin package (note that it uses underscores, not hyphens) and static/plugin.js is the path within that package to the static file.
 
@@ -152,11 +151,13 @@ Templates should be bundled for distribution using the same package_data mec
 
 .. code-block:: python
 
-    package_data={
-        'datasette_plugin_name': [
-            'templates/my_template.html',
-        ],
-    },
+    package_data = (
+        {
+            "datasette_plugin_name": [
+                "templates/my_template.html",
+            ],
+        },
+    )
 
 You can also use wildcards here such as templates/*.html. See datasette-edit-schema <https://github.com/simonw/datasette-edit-schema>__ for an example of this pattern.
```

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Try blacken-docs (989986586)
id: 914439356
html_url: https://github.com/simonw/datasette/issues/1461#issuecomment-914439356
issue_url: https://api.github.com/repos/simonw/datasette/issues/1461
node_id: IC_kwDOBm6k_c42gTy8
user: simonw (9599)
created_at: 2021-09-07T16:11:37Z
updated_at: 2021-09-07T16:11:37Z
author_association: OWNER

body:

```
(datasette) datasette % blacken-docs docs/*.rst
docs/authentication.rst: Rewriting...
docs/internals.rst:169: code block parse error Cannot parse: 14:0: <line number missing in source>
docs/plugin_hooks.rst:251: code block parse error Cannot parse: 6:4: ]
docs/plugin_hooks.rst:312: code block parse error Cannot parse: 38:0: <line number missing in source>
docs/spatialite.rst: Rewriting...
docs/testing_plugins.rst:135: code block parse error Cannot parse: 5:0: <line number missing in source>
docs/writing_plugins.rst: Rewriting...
```

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Try blacken-docs (989986586)


Table schema:

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
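
A minimal sketch of reproducing this view's query, "3 rows where issue = 989986586 sorted by updated_at descending", against the issue_comments schema above. The database filename github.db is an assumption about where a github-to-sqlite export lives locally, not something stated on this page.

```python
# Sketch only: "github.db" is an assumed filename for a github-to-sqlite database.
import sqlite3

conn = sqlite3.connect("github.db")
conn.row_factory = sqlite3.Row

rows = conn.execute(
    """
    SELECT id, [user], created_at, updated_at, author_association, body
    FROM issue_comments
    WHERE issue = ?
    ORDER BY updated_at DESC
    """,
    (989986586,),
).fetchall()

for row in rows:
    print(row["id"], row["updated_at"], row["author_association"])
```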