Comment 1105615625 by simonw (OWNER) on simonw/datasette#1101, created 2022-04-21T18:31:41Z, updated 2022-04-21T18:32:22Z
https://github.com/simonw/datasette/issues/1101#issuecomment-1105615625

The datasette-geojson plugin is actually an interesting case here, because of the way it converts SpatiaLite geometries into GeoJSON: https://github.com/eyeseast/datasette-geojson/blob/602c4477dc7ddadb1c0a156cbcd2ef6688a5921d/datasette_geojson/__init__.py#L61-L66

```python
if isinstance(geometry, bytes):
    results = await db.execute(
        "SELECT AsGeoJSON(:geometry)", {"geometry": geometry}
    )
    return geojson.loads(results.single_value())
```

That actually seems to work really well as-is, but it does worry me a bit that it ends up having to execute an extra `SELECT` query for every single returned row - especially in streaming mode where it might be asked to return 1m rows at once.

My PostgreSQL/MySQL engineering brain says that this would be better handled by doing a chunk of these (maybe 100) at once, to avoid the per-query overhead - but with SQLite that might not be necessary.

At any rate, this is one of the reasons I'm interested in "iterate over this sequence of chunks of 100 rows at a time" as a potential option here.

Of course, a better solution would be for datasette-geojson to have a way to influence the SQL query before it is executed, adding an AsGeoJSON(geometry) clause to it - so that's something I'm open to as well.
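One cheap way a hook like that could rewrite the query is to wrap the original SQL in a subquery and convert the geometry column in the outer `SELECT`. A sketch under those assumptions - `rewrite_sql` is a hypothetical name, and real code would need proper identifier quoting:

```python
def rewrite_sql(sql, columns, geometry_column="geometry"):
    # Hypothetical: wrap the original query in a subquery and apply
    # AsGeoJSON() to the geometry column, keeping its name via AS.
    # (Real code would need to quote column identifiers properly.)
    select_list = ", ".join(
        f"AsGeoJSON({c}) AS {c}" if c == geometry_column else c
        for c in columns
    )
    return f"SELECT {select_list} FROM ({sql})"
```

With this approach the conversion happens inside the one original query, so the per-row follow-up `SELECT` disappears entirely.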
