
issues


505 rows where repo = 107914493, state = "open" and type = "issue" sorted by updated_at descending

id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app reactions draft state_reason
2029908157 I_kwDOBm6k_c54_fC9 2214 CSV export fails for some `text` foreign key references precipice 2874 open 0     1 2023-12-07T05:04:34Z 2023-12-07T07:36:34Z   NONE  

I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research.

I'm using Datasette with the SWITRS data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com.

Their data makes extensive use of codes for incident column data (1 for Monday and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or -. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read.

If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail.

The foreign key configuration is as follows:

```
# Some tables use integer ids, like sensible tables do. Let's import them first
# since we favor them.
for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY
do
  sqlite-utils create-table records.db $TABLE id integer name text --pk=id
  sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
  sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id
  sqlite-utils create-index records.db collisions $TABLE
done

# Other tables use letter keys, like they were raised by WOLVES. Let's put them
# at the end of the import queue.
for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \
  PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \
  PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \
  STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP
do
  sqlite-utils create-table records.db $TABLE key text name text --pk=key
  sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
  sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key
  sqlite-utils create-index records.db collisions $TABLE
done
```

You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db

If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the "advanced" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output:

```
INFO:     127.0.0.1:57885 - "GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1" 200 OK
Caught this error:
```

(No other output follows `Caught this error:` other than a blank line.)

I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2214/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
2028698018 I_kwDOBm6k_c5463mi 2213 feature request: gzip compression of database downloads fgregg 536941 open 0     1 2023-12-06T14:35:03Z 2023-12-06T15:05:46Z   CONTRIBUTOR  

At the bottom of database pages, datasette gives users the opportunity to download the underlying sqlite database. It would be great if that could be served gzip compressed.

This is similar to #1213, but in my case I don't need Datasette to compress HTML and JSON because my CDN layer does that for me. However, Cloudflare at least will not compress a mimetype of "application".

(See the list of mimetypes: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/)
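
For reference, a minimal sketch of the requested behavior done ahead of time (nothing Datasette ships; "fixtures.db" is a placeholder filename): pre-compressing the SQLite file so a static server or CDN can serve it with Content-Encoding: gzip.

```python
import gzip
import shutil

# Hypothetical one-off: pre-compress the SQLite file so it can be served
# with Content-Encoding: gzip; "fixtures.db" is a placeholder filename.
with open("fixtures.db", "rb") as src:
    with gzip.open("fixtures.db.gz", "wb") as dst:
        shutil.copyfileobj(src, dst)
```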

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2213/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
2023057255 I_kwDOBm6k_c54lWdn 2212 Can't filter with numbers fzakaria 605070 open 0     0 2023-12-04T05:26:29Z 2023-12-04T05:26:29Z   NONE  

I have a schema that uses numbers for a column (actually it's a boolean 1 or 0, but SQLite doesn't have a Boolean type). I can't seem to get the facet to work, or even to filter on this column.

My guess is that Datasette is "stringifying" the number and it's not matching? Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0
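
A quick standard-library check of that guess (a sketch, independent of Datasette itself): in SQLite a bare comparison between an integer and a text literal never matches, so a stringified filter value would return no rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# An integer compared with a text literal does not match in SQLite,
# which is the "stringifying" failure mode guessed at above:
print(conn.execute("SELECT 1 WHERE 0 = '0'").fetchall())  # []
print(conn.execute("SELECT 1 WHERE 0 = 0").fetchall())    # [(1,)]
```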

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2212/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
2019811176 I_kwDOBm6k_c54Y99o 2211 Unreachable exception handlers for `sqlite3.OperationalError` mattparmett 1214074 open 0     0 2023-12-01T00:50:22Z 2023-12-01T00:50:22Z   NONE  

There are several places where sqlite3.OperationalError is caught as part of an exception handler which catches multiple exceptions, but is then caught again immediately afterwards by a dedicated exception handler.

Because the exception will be caught by the first handler, the logic in the second handler is unreachable and will never be executed. If this is intended behavior, the second handler can be removed. If this is not intended, and the second handler should be the one that catches this exception, then sqlite3.OperationalError should be removed from the tuple of exceptions in the first handler.
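
A minimal sketch of the pattern being described (hypothetical names, not the actual Datasette code):

```python
import sqlite3

class QueryError(Exception):
    pass

class QueryInterrupted(Exception):
    pass

def run_query(conn, sql):
    try:
        return conn.execute(sql).fetchall()
    except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
        # Every sqlite3.OperationalError is caught here first...
        raise QueryError(str(e))
    except sqlite3.OperationalError as e:
        # ...so this dedicated handler is unreachable dead code.
        raise QueryInterrupted(str(e))
```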

This issue was found via a CodeQL query on the repository, and I've listed the occurrences found by the query below. There may be other instances of this issue in the code that were not surfaced by the query. I'd be happy to share the query if others would like to view or run it.

One example:

https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/views/database.py#L534-L537

Other instances:

https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L266-L270 https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L452-L456

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2211/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
564833696 MDU6SXNzdWU1NjQ4MzM2OTY= 670 Prototoype for Datasette on PostgreSQL simonw 9599 open 0     15 2020-02-13T17:17:55Z 2023-11-17T15:32:21Z   OWNER  

I thought this would never happen, but now that I'm deep in the weeds of running SQLite in production for Datasette Cloud I'm starting to reconsider my policy of only supporting SQLite.

Some of the factors making me think PostgreSQL support could be worth the effort: - Serverless. I'm getting increasingly excited about writable-database use-cases for Datasette. If it could talk to PostgreSQL then users could easily deploy it on Heroku or other serverless providers that can talk to a managed RDS-style PostgreSQL. - Existing databases. Plenty of organizations have PostgreSQL databases. They can export to SQLite using db-to-sqlite but that's a pretty big barrier to getting started - being able to run datasette postgresql://connection-string and start trying it out would be a massively better experience. - Data size. I keep running into use-cases where I want to run Datasette against many GBs of data. SQLite can do this but PostgreSQL is much more optimized for large data, especially given the existence of tools like Citus. - Marketing. Convincing people to trust their data to SQLite is potentially a big barrier to adoption. Even if I've convinced myself it's trustworthy I still have to convince everyone else. - It might not be that hard? If this required a ground-up rewrite it wouldn't be worth the effort, but I have a hunch that it may not be too hard - most of the SQL in Datasette should work on both databases since it's almost all portable SELECT statements. If Datasette did DML this would be a lot harder, but it doesn't. - Plugins! This feels like a natural surface for a plugin - at which point people could add MySQL support and suchlike in the future.

The above reasons feel strong enough to justify a prototype.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/670/reactions",
    "total_count": 19,
    "+1": 14,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 5,
    "rocket": 0,
    "eyes": 0
}
   
1994857251 I_kwDOBm6k_c525xsj 2208 No suggested facets when a column named 'value' is included rgieseke 198537 open 0     1 2023-11-15T14:11:17Z 2023-11-15T14:18:59Z   CONTRIBUTOR  

When a column named 'value' is included, no suggested facets are shown, because the suggestion query uses an alias of 'value'.

https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L168-L174

Currently the following is shown (from https://latest.datasette.io/fixtures/facetable)

When I add a column named 'value' only the JSON facets are processed.

I think that not using aliases could be a solution (except if someone wants to use a column named count(*), though that seems unlikely). I'll open a PR with that.

There is also a TODO with a similar question in the same file. I have not looked into that yet.

https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L512

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2208/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1994845152 I_kwDOBm6k_c525uvg 2207 ModuleNotFoundError: No module named 'click_default_group honzajavorek 283441 open 0     0 2023-11-15T14:04:32Z 2023-11-15T14:04:32Z   NONE  

No matter what I do, I'm getting this error:

```
$ datasette
Traceback (most recent call last):
  File "/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/bin/datasette", line 5, in <module>
    from datasette.cli import cli
  File "/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/lib/python3.11/site-packages/datasette/cli.py", line 6, in <module>
    from click_default_group import DefaultGroup
ModuleNotFoundError: No module named 'click_default_group'
```

I have datasette in my dependencies like this:

```toml
[tool.poetry.group.dev.dependencies]
datasette = {version = "1.0a7", allow-prereleases = true}
```

I had the latest regular version (not pre-release) there originally, but the result was the same:

```toml
[tool.poetry.group.dev.dependencies]
datasette = "0.64.5"
```

The full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ Previously datasette worked for me, but I guess something got upgraded and now I can't even launch it.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2207/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1978023780 I_kwDOBm6k_c515j9k 2205 request.post_vars() method obliterates form keys with multiple values simonw 9599 open 0   Datasette 1.0a-next 8755003 3 2023-11-05T23:25:08Z 2023-11-06T04:10:34Z   OWNER  

https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L137-L139

In GET requests you can do ?foo=1&foo=2 - you can do the same in POST requests, but the dict() call here eliminates those duplicates.
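
A standard-library illustration of the problem (separate from Datasette's own code): wrapping the parsed pairs in dict() keeps only the last value for each repeated key.

```python
from urllib.parse import parse_qs, parse_qsl

body = "foo=1&foo=2&bar=3"

# dict() over the parsed pairs keeps only the last value per key:
print(dict(parse_qsl(body)))  # {'foo': '2', 'bar': '3'} - foo=1 is gone

# parse_qs preserves every value by mapping each key to a list:
print(parse_qs(body))         # {'foo': ['1', '2'], 'bar': ['3']}
```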

You can't even try calling post_body() and implement your own custom parsing because of: - #2204

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2205/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1978022687 I_kwDOBm6k_c515jsf 2204 request.post_body() can only be called once simonw 9599 open 0     0 2023-11-05T23:22:03Z 2023-11-05T23:23:23Z   OWNER  

This code here:

https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L127-L135

It consumes the messages, which means if you try to call it a second time you won't be able to get at the body.

This is efficient - we don't end up with a request object property with potentially megabytes of content that we never look at again - but it's inconvenient for cases like middleware or functions where we don't know if the body has been consumed yet or not.

Potential solution: set request._body the first time it is called, and return that on subsequent calls.
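
A minimal sketch of that caching idea against a bare ASGI receive callable (the _body attribute name comes from the proposal above; this is not the shipped implementation):

```python
class Request:
    def __init__(self, receive):
        self._receive = receive
        self._body = None  # cache; attribute name taken from the proposal above

    async def post_body(self) -> bytes:
        # Consume the ASGI receive channel only on the first call;
        # later calls return the cached bytes instead of failing.
        if self._body is None:
            chunks = []
            more_body = True
            while more_body:
                message = await self._receive()
                chunks.append(message.get("body", b""))
                more_body = message.get("more_body", False)
            self._body = b"".join(chunks)
        return self._body
```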

Potential optimization: only do this for bodies that are shorter than a certain threshold - maybe 1MB - and raise an exception if you attempt to call post_body() multiple times against one of those larger bodies.

I'm a bit nervous about that option though, since it could result in errors that don't show up in testing but do show up in production.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2204/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
959137143 MDU6SXNzdWU5NTkxMzcxNDM= 1415 feature request: document minimum permissions for service account for cloudrun fgregg 536941 open 0     4 2021-08-03T13:48:43Z 2023-11-05T16:46:59Z   CONTRIBUTOR  

Thanks again for such a powerful project.

For deploying to cloudrun from github actions, I'd like to create a service account with minimal permissions.

It would be great to document what those minimum permission that need to be set in the IAM.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1415/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1977726056 I_kwDOBm6k_c514bRo 2203 custom plugin not seen as sql function LyzardKing 7113541 open 0     0 2023-11-05T10:30:19Z 2023-11-05T10:30:19Z   NONE  

Hi, I'm not sure if this is the right repo for this issue.

I'm using datasette with the parquet (to read a duckdb), and jellyfish plugins. Both work perfectly.

Now I need to create a simple plugin that uses the python rouge package and returns a similarity score (similarly to how the jellyfish plugin works). If I create a custom plugin, even the example hello_world one, copied directly from the tutorial, I get the following error: duckdb.duckdb.CatalogException: Catalog Error: Scalar Function with name hello_world does not exist!

Since the jellyfish plugin doesn't do anything more complex, I'm wondering if there is some other kind of issue with my setup.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2203/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1955676270 I_kwDOBm6k_c50kUBu 2201 Discord invite link is invalid andrewsanchez 11708906 open 0     0 2023-10-21T21:50:05Z 2023-10-21T21:50:05Z   NONE  

https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2201/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1163369515 I_kwDOBm6k_c5FV5wr 1655 query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data fgregg 536941 open 0     8 2022-03-09T00:56:40Z 2023-10-17T21:53:17Z   CONTRIBUTOR  

This page

is using about 400 MB in Firefox 97 on Mac OS X. If you download the HTML for the page, it's about 11 MB, and if you get the CSV for the data it's about 1 MB.

It's using over 1 GB on Chrome 99.

I found this because I was trying to figure out why editing the SQL was getting very slow.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1655/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1940346034 I_kwDOBm6k_c5zp1Sy 2199 Detailed upgrade instructions for metadata.yaml -> datasette.yaml simonw 9599 open 0   Datasette 1.0 3268330 7 2023-10-12T16:21:25Z 2023-10-12T22:08:42Z   OWNER  

```
Exception: Datasette no longer accepts plugin configuration in --metadata. Move your "plugins" configuration blocks to a separate file - we suggest calling that datasette..json - and start Datasette with datasette -c datasette..json. See https://docs.datasette.io/en/latest/configuration.html for more details.
```

I think we should link directly to documentation that tells people how to perform this upgrade.

Originally posted by @simonw in https://github.com/simonw/datasette/issues/2190#issuecomment-1759947021

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2199/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1931794126 I_kwDOBm6k_c5zJNbO 2198 --load-extension=spatialite not working with Windows hcarter333 363004 open 0     0 2023-10-08T12:50:22Z 2023-10-08T12:50:22Z   NONE  

Using each of:

```
python -m datasette counties.db -m metadata.yml --load-extension=SpatiaLite
```

and

```
python -m datasette counties.db --load-extension="C:\Windows\System32\mod_spatialite.dll"
```

and

```
python -m datasette counties.db --load-extension=C:\Windows\System32\mod_spatialite.dll
```

I got the error:

``` File "C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\database.py", line 209, in in_thread self.ds._prepare_connection(conn, self.name) File "C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\app.py", line 596, in _prepare_connection conn.execute("SELECT load_extension(?, ?)", [path, entrypoint]) sqlite3.OperationalError: The specified module could not be found.

```

I finally tried modifying the code in app.py to read:

```python
def _prepare_connection(self, conn, database):
    conn.row_factory = sqlite3.Row
    conn.text_factory = lambda x: str(x, "utf-8", "replace")
    if self.sqlite_extensions:
        conn.enable_load_extension(True)
        for extension in self.sqlite_extensions:
            # "extension" is either a string path to the extension
            # or a 2-item tuple that specifies which entrypoint to load.
            #if isinstance(extension, tuple):
            #    path, entrypoint = extension
            #    conn.execute("SELECT load_extension(?, ?)", [path, entrypoint])
            #else:
            conn.execute("SELECT load_extension('C:\Windows\System32\mod_spatialite.dll')")
```

At which point the counties example worked.

Is there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used.
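
One way to narrow this down (a standalone check, not a known fix): try loading the DLL through sqlite3 directly, outside Datasette, using the same path. If this also raises "The specified module could not be found", the problem lies with the DLL or its dependencies rather than with Datasette's --load-extension handling.

```python
import sqlite3

# Standalone check, independent of Datasette: does sqlite3 itself
# manage to load the extension from this path?
conn = sqlite3.connect(":memory:")
conn.enable_load_extension(True)
conn.execute(
    "SELECT load_extension(?)",
    [r"C:\Windows\System32\mod_spatialite.dll"],
)
print("mod_spatialite loaded")
```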

On an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line:

```
python -m datasette rm_toucans_23_10_07.db counties.db -m metadata.yml --load-extension=SpatiaLite
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2198/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
777333388 MDU6SXNzdWU3NzczMzMzODg= 1168 Mechanism for storing metadata in _metadata tables simonw 9599 open 0     21 2021-01-01T18:47:27Z 2023-09-28T18:29:05Z   OWNER  

Original title: Perhaps metadata should all live in a _metadata in-memory database

Inspired by #1150 - metadata should be exposed as an API, and for large Datasette instances that API may need to be paginated. So why not expose it through an in-memory database table?

One catch to this: plugins. #860 aims to add a plugin hook for metadata. But if the metadata comes from an in-memory table, how do the plugins interact with it?

The need to paginate over metadata does make a plugin hook that returns metadata for an individual table seem less wise, since we don't want to have to do 10,000 plugin hook invocations to show a list of all metadata.

If those plugins write directly to the in-memory table how can their contributions survive the server restarting?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1168/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1907765514 I_kwDOBm6k_c5xtjEK 2195 `datasette publish` needs support for the new config/metadata split simonw 9599 open 0     9 2023-09-21T21:08:12Z 2023-09-21T22:57:48Z   OWNER  

... which raises the challenge that datasette publish doesn't yet know what to do with a config file!

Originally posted by @simonw in https://github.com/simonw/datasette/issues/2194#issuecomment-1730259871

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2195/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1825007061 I_kwDOBm6k_c5sx2XV 2123 datasette serve when invoked with --reload interprets the serve command as a file cadeef 79087 open 0     2 2023-07-27T19:07:22Z 2023-09-18T13:02:46Z   NONE  

When running datasette serve with the --reload flag, the serve command is picked up as a file argument:

```
$ datasette serve --reload test_db
Starting monitor for PID 13574.
Error: Invalid value for '[FILES]...': Path 'serve' does not exist.
Press ENTER or change a file to reload.
```

If a 'serve' file is created it launches properly (albeit with an empty database called serve):

```
$ touch serve; datasette serve --reload test_db
Starting monitor for PID 13628.
INFO:     Started server process [13628]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
```

Version (running from HEAD on main):

```
$ datasette --version
datasette, version 1.0a2
```

This issue appears to have existed for a while; https://github.com/simonw/datasette/issues/1380#issuecomment-953366110 mentions the error in a different context.

I'm happy to debug and land a patch if it's welcome.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2123/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1900026059 I_kwDOBm6k_c5xQBjL 2188 Plugin Hooks for "compile to SQL" languages asg017 15178711 open 0     2 2023-09-18T01:37:15Z 2023-09-18T06:58:53Z   CONTRIBUTOR  

There's a ton of tools/languages that compile to SQL, which may be nice in Datasette. Some examples:

  • Logica https://logica.dev
  • PRQL https://prql-lang.org
  • Malloy, but not sure if it works with SQLite? https://github.com/malloydata/malloy

It would be cool if plugins could extend Datasette to use these languages, in both the code editor and API usage.

A few things I'd imagine a datasette-prql or datasette-logica plugin would do:

  • prql= instead of sql=
  • Code editor support (syntax highlighting, autocomplete)
  • Hide/show SQL
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2188/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
787098345 MDU6SXNzdWU3ODcwOTgzNDU= 1191 Ability for plugins to collaborate when adding extra HTML to blocks in default templates simonw 9599 open 0   Datasette 1.0 3268330 12 2021-01-15T18:18:51Z 2023-09-18T06:55:52Z   OWNER  

Sometimes a plugin may want to add content to an existing default template - for example datasette-search-all adds a new search box at the top of index.html. I also want datasette-upload-csvs to add a CTA on the database.html page: https://github.com/simonw/datasette-upload-csvs/issues/18

Currently plugins can do this by providing a new version of the index.html template - but if multiple plugins try to do that only one of them will succeed.

It would be better if there were known areas of those templates which plugins could add additional content to, such that multiple plugins can use the same spot.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1191/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1899310542 I_kwDOBm6k_c5xNS3O 2187 Datasette for serving JSON only geofinder 19705106 open 0     0 2023-09-16T05:48:29Z 2023-09-16T05:48:29Z   NONE  

Hi, is there any way to use Datasette for serving JSON only, without displaying the web page? I've tried searching the documentation for this but didn't find any information.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2187/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1898927976 I_kwDOBm6k_c5xL1do 2186 Mechanism for register_output_renderer hooks to access full count simonw 9599 open 0   Datasette 1.0 3268330 2 2023-09-15T18:57:54Z 2023-09-15T19:27:59Z   OWNER  

The cause of this bug: - https://github.com/simonw/datasette-export-notebook/issues/17

Is that datasette-export-notebook was consulting data["filtered_table_rows_count"] in the render output plugin function in order to show the total number of rows that would be exported.

That field is no longer available by default - the "count" field is only available if ?_extra=count was passed.

It would be useful if plugins like this could access the total count on demand, should they need to.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2186/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1895266807 I_kwDOBm6k_c5w93n3 2184 Design decision - should configuration be exposed at /-/config ? simonw 9599 open 0     0 2023-09-13T21:07:08Z 2023-09-13T21:07:38Z   OWNER  

This made me think. That {"$env": "ENV_VAR"} hack was introduced back here:

  • https://github.com/simonw/datasette/issues/538

The problem it was solving was that metadata was visible to everyone with access to the instance at /-/metadata but plugins clearly needed a way to set secret settings.

Now that this stuff is moving to config, we have some decisions to make:

  1. Add /-/config to let people see the configuration of their instance, and keep the $env trick for secret settings.
  2. Say all configuration aside from metadata is secret and make $env optional or ditch it entirely.
  3. Allow plugins to announce which of their configuration options are secret so we can automatically redact them from /-/config

I've found /-/metadata extraordinarily useful as a user of Datasette - it really helps me understand exactly what's going on if I run into any problems with a plugin, if I can quickly check what the settings look like.

So I'm leaning towards option 1 or 3.
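
As a rough sketch, option 3's automatic redaction might look something like this (purely illustrative - no such helper exists in Datasette):

```python
def redact_secrets(config, secret_keys=frozenset({"client_secret", "api_key"})):
    # Recursively replace {"$env": ...} references and keys a plugin has
    # announced as secret with a placeholder, before exposing the
    # configuration at /-/config. "secret_keys" is a hypothetical example.
    if isinstance(config, dict):
        if set(config) == {"$env"}:
            return "***"
        return {
            k: "***" if k in secret_keys else redact_secrets(v, secret_keys)
            for k, v in config.items()
        }
    if isinstance(config, list):
        return [redact_secrets(v, secret_keys) for v in config]
    return config
```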

Originally posted by @simonw in https://github.com/simonw/datasette/pull/2183#discussion_r1325076924

Also refs: - #2093

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2184/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1781530343 I_kwDOBm6k_c5qL_7n 2093 Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File asg017 15178711 open 0     8 2023-06-29T21:18:23Z 2023-09-11T20:19:32Z   CONTRIBUTOR  

Very often I get tripped up when trying to configure my Datasette instances. For example: if I want to change the port my app listens to, do I do that with a CLI flag, a --setting flag, inside metadata.json, or an env var? If I want to raise the time limit for SQL statements, is that under metadata.json or a setting? Where does my plugin configuration go?

Normally I need to look it up in the Datasette docs, and I quickly find my answer, but the number of places where "config" goes is overwhelming.

  • Flat CLI flags like --port, --host, --cors, etc.
  • --setting, like default_page_size, sql_time_limit_ms etc
  • Inside metadata.json, including plugin configuration

Typically my Datasette deploys are extremely long shell commands, with multiple --setting and other CLI flags.

Proposal: Consolidate all "config" into datasette.toml

I propose that we add a new datasette.toml that combines "settings", "metadata", and other common CLI flags like --port and --cors into a single file. It would be similar to "Cargo.toml" in Rust projects, "package.json" in Node projects, and "pyproject.toml" in Python, etc.

A sample of what it could look like:

```toml
# "top level" configuration that are currently CLI flags on datasette serve
[config]
port = 8020
host = "0.0.0.0"
cors = true

# replaces multiple --setting flags
[settings]
base_url = "/app/datasette/"
default_allow_sql = true
sql_time_limit_ms = 3500

# replaces metadata.json. The contents of datasette-metadata.json could be
# defined in this file instead, but supporting separate files is nice
# (since those are easy to machine-generate)
[metadata]
include = "./datasette-metadata.json"

# plugin-specific
[plugins]
[plugins.datasette-auth-github]
client_id = {env = "DATASETTE_AUTH_GITHUB_CLIENT_ID"}
client_secret = {env = "GITHUB_CLIENT_SECRET"}

[plugins.datasette-cluster-map]
latitude_column = "lat"
longitude_column = "lon"
```

Pros

  • Instead of multiple files and CLI flags, everything could be in one tidy file
  • Editing config in a separate file is easier than editing CLI flags, since you don't have to kill a process + edit a command every time
  • New users will know to "just edit my datasette.toml" instead of needing to learn metadata + settings + CLI flags
  • Better dev experience for multiple environment. For example, could have datasette -c datasette-dev.toml for local dev environments (enables SQL, debug plugins, long timeouts, etc.), and a datasette -c datasette-prod.toml for "production" (lower timeouts, less plugins, monitoring plugins, etc.)

Cons

  • Yet another config-management system. Now Datasette users will need to know about metadata, settings, CLI flags, and datasette.toml. However with enough documentation + announcements + examples, I think we can get ahead of it.
  • If toml is chosen, would need to add a toml parser for Python version <3.11
  • Multiple sources of config require priority. For example: Would --setting default_allow_sql off override the value inside [settings]? What about --port?

Other Notes

Toml

I chose toml over json because toml supports comments. I chose toml over yaml because Python 3.11 has builtin support for it. I also find toml easier to work with, since it doesn't have the odd "gotchas" that YAML has (ex "3.10" resolving to 3.1, Norway's "NO" resolving to false, etc.). It also mimics pyproject.toml, which is nice. Happy to change my mind about this, however.
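
For reference, a sketch of the parsing side under that choice (tomllib ships with Python 3.11+; the third-party tomli backport, the parser dependency mentioned in the cons above, has the same API):

```python
import sys

# tomllib is in the standard library from Python 3.11; on older
# versions the third-party "tomli" package provides the same API.
if sys.version_info >= (3, 11):
    import tomllib
else:
    import tomli as tomllib

with open("datasette.toml", "rb") as f:
    config = tomllib.load(f)

print(config["settings"]["sql_time_limit_ms"])  # e.g. 3500 from the sample above
```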

Plugin config will be difficult

Plugin config is currently in metadata.json in two places:

  1. Top level, under "plugins.[plugin-name]". This fits well into datasette.toml as [plugins.plugin-name]
  2. Table level, under "databases.[db-name].tables.[table-name].plugins.[plugin-name]". This doesn't fit that well into datasette.toml, unless it's nested under [metadata]?

Extensions, static, one-off plugins?

We could also include equivalents of --plugins-dir, --static, and --load-extension into datasette.toml, but I'd imagine there's a few security concerns there to think through.

Explicitly list which plugins to use?

I believe Datasette by default will load all installed plugins on startup, but maybe datasette.toml could specify a list of plugins to use? For example, a dev version of datasette.toml could specify datasette-pretty-traces, but the prod version could leave it out.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2093/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1876353656 I_kwDOBm6k_c5v1uJ4 2168 Consider a request/response wrapping hook slightly higher level than asgi_wrapper() simonw 9599 open 0     6 2023-08-31T21:42:04Z 2023-09-10T17:54:08Z   OWNER  

There's a long justification for why this might be needed here: - https://github.com/simonw/datasette-auth-tokens/issues/10#issuecomment-1701820001

Short version: it would be neat if it was possible to stash some data on the request object such that a later plugin/middleware-type-thing could use that to influence the final returned response - similar to the kinds of things you can do with Django middleware.

The asgi_wrapper() mechanism doesn't have access to the request or response objects - it gets scope and can mess around with receive and send, but those are pretty low-level primitives.

Since Datasette has well-defined request and response objects now it might be nice to have a middleware layer that can manipulate those directly.
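
Purely as a strawman, such a hook might look something like this (the hook name and signature are hypothetical - nothing like this exists in Datasette today):

```python
from datasette import hookimpl

@hookimpl
def wrap_view(datasette):
    # Hypothetical middleware: runs around the request/response cycle
    # with real Request/Response objects, Django-middleware style.
    async def middleware(request, get_response):
        # Stash some per-request state for later plugins to consult:
        request.scope.setdefault("state", {})["plugin_saw_request"] = True
        response = await get_response(request)
        if request.scope["state"].get("plugin_saw_request"):
            response.headers["x-seen-by-plugin"] = "1"
        return response
    return middleware
```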

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2168/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1010112818 I_kwDOBm6k_c48NRky 1479 Win32 "used by another process" error with datasette publish kirajano 76450761 open 0     7 2021-09-28T19:12:00Z 2023-09-07T02:14:16Z   NONE  

I unfortunately was not successful in deploying to fly.io. Please see below the details of the three scenarios that I tried. I am also new to datasette.

Failed to deploy. Attaching logs:

1. Tried with an app created via `flyctl apps create frosty-fog-8565` and then ran `datasette publish fly covid.db --app frosty-fog-8565`:

```
Deploying frosty-fog-8565
==> Validating app configuration
--> Validating app configuration done
Services
TCP 80/443 ⇢ 8080

Error error connecting to docker: An unknown error occured.

Traceback (most recent call last):
  File "c:\users\grott\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\grott\anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py", line 7, in <module>
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 782, in main
    rv = self.invoke(ctx)
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py", line 156, in fly
    "--remote-only",
  File "c:\users\grott\anaconda3\lib\contextlib.py", line 119, in __exit__
    next(self.gen)
  File "c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py", line 451, in temporary_docker_directory
    tmp.cleanup()
  File "c:\users\grott\anaconda3\lib\tempfile.py", line 811, in cleanup
    _shutil.rmtree(self.name)
  File "c:\users\grott\anaconda3\lib\shutil.py", line 516, in rmtree
    return _rmtree_unsafe(path, onerror)
  File "c:\users\grott\anaconda3\lib\shutil.py", line 395, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
  File "c:\users\grott\anaconda3\lib\shutil.py", line 404, in _rmtree_unsafe
    onerror(os.rmdir, path, sys.exc_info())
  File "c:\users\grott\anaconda3\lib\shutil.py", line 402, in _rmtree_unsafe
    os.rmdir(path)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\Users\grott\AppData\Local\Temp\tmpgcm8cz66\frosty-fog-8565'
```

  2. Tried also with an app that gets autogenerated when running `flyctl launch` (this also generates the .toml file). Then ran `datasette publish fly covid.db --app dark-feather-168`, but got a different error now:

```
Deploying dark-feather-168
==> Validating app configuration

Error not possible to validate configuration: server returned Post "https://api.fly.io/graphql": unexpected EOF

Traceback (most recent call last):
  File "c:\users\grott\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\grott\anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py", line 7, in <module>
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 782, in main
    rv = self.invoke(ctx)
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py", line 156, in fly
    "--remote-only",
  File "c:\users\grott\anaconda3\lib\contextlib.py", line 119, in __exit__
    next(self.gen)
  File "c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py", line 451, in temporary_docker_directory
    tmp.cleanup()
  File "c:\users\grott\anaconda3\lib\tempfile.py", line 811, in cleanup
    _shutil.rmtree(self.name)
  File "c:\users\grott\anaconda3\lib\shutil.py", line 516, in rmtree
    return _rmtree_unsafe(path, onerror)
  File "c:\users\grott\anaconda3\lib\shutil.py", line 395, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
  File "c:\users\grott\anaconda3\lib\shutil.py", line 404, in _rmtree_unsafe
    onerror(os.rmdir, path, sys.exc_info())
  File "c:\users\grott\anaconda3\lib\shutil.py", line 402, in _rmtree_unsafe
    os.rmdir(path)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\Users\grott\AppData\Local\Temp\tmpnoyewcre\dark-feather-168'
```

These are also the contents of the generated .toml file in scenario 2:

```toml
# fly.toml file generated for dark-feather-168 on 2021-09-28T20:35:44+02:00

app = "dark-feather-168"

kill_signal = "SIGINT"
kill_timeout = 5
processes = []

[env]

[experimental]
  allowed_public_ports = []
  auto_rollback = true

[[services]]
  http_checks = []
  internal_port = 8080
  processes = ["app"]
  protocol = "tcp"
  script_checks = []

  [services.concurrency]
    hard_limit = 25
    soft_limit = 20
    type = "connections"

  [[services.ports]]
    handlers = ["http"]
    port = 80

  [[services.ports]]
    handlers = ["tls", "http"]
    port = 443

  [[services.tcp_checks]]
    grace_period = "1s"
    interval = "15s"
    restart_limit = 6
    timeout = "2s"
```

  3. But also trying `datasette package covid.db` to create a local DOCKERFILE to later try to push it via `flyctl deploy` fails as well:

```
[+] Building 147.3s (11/11) FINISHED
 => [internal] load build definition from Dockerfile  0.2s
 => => transferring dockerfile: 396B  0.0s
 => [internal] load .dockerignore  0.1s
 => => transferring context: 2B  0.0s
 => [internal] load metadata for docker.io/library/python:3.8  4.7s
 => [auth] library/python:pull token for registry-1.docker.io  0.0s
 => [internal] load build context  0.1s
 => => transferring context: 82.37kB  0.0s
 => [1/5] FROM docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3  108.3s
 => => resolve docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5  0.0s
 => => sha256:56182bcdf4d4283aa1f46944b4ef7ac881e28b4d5526720a4e9ba03a4730846a 2.22kB / 2.22kB  0.0s
 => => sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 54.93MB / 54.93MB  21.6s
 => => sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 10.87MB / 10.87MB  15.5s
 => => sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5c5b6 1.86kB / 1.86kB  0.0s
 => => sha256:ff08f08727e50193dcf499afc30594c47e70cc96f6fcfd1a01240524624264d0 8.65kB / 8.65kB  0.0s
 => => sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 5.15MB / 5.15MB  6.4s
 => => sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 54.57MB / 54.57MB  37.7s
 => => sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 196.45MB / 196.45MB  82.3s
 => => sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 6.29MB / 6.29MB  26.6s
 => => extracting sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366  5.4s
 => => sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 16.81MB / 16.81MB  40.5s
 => => extracting sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8  0.6s
 => => extracting sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64  0.6s
 => => sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 232B / 232B  40.1s
 => => extracting sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587  5.8s
 => => sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 2.35MB / 2.35MB  41.8s
 => => extracting sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077  18.8s
 => => extracting sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58  1.2s
 => => extracting sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26  2.9s
 => => extracting sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b  0.0s
 => => extracting sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543  0.8s
 => [2/5] COPY . /app  2.3s
 => [3/5] WORKDIR /app  0.2s
 => [4/5] RUN pip install -U datasette  26.9s
 => [5/5] RUN datasette inspect covid.db --inspect-file inspect-data.json  3.1s
 => exporting to image  1.2s
 => => exporting layers  1.2s
 => => writing image sha256:b5db0c205cd3454c21fbb00ecf6043f261540bcf91c2dfc36d418f1a23a75d7a  0.0s

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them

Traceback (most recent call last):
  File "c:\users\grott\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\grott\anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py", line 7, in <module>
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 782, in main
    rv = self.invoke(ctx)
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "c:\users\grott\anaconda3\lib\site-packages\click\core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "c:\users\grott\anaconda3\lib\site-packages\datasette\cli.py", line 283, in package
    call(args)
  File "c:\users\grott\anaconda3\lib\contextlib.py", line 119, in __exit__
    next(self.gen)
  File "c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py", line 451, in temporary_docker_directory
    tmp.cleanup()
  File "c:\users\grott\anaconda3\lib\tempfile.py", line 811, in cleanup
    _shutil.rmtree(self.name)
  File "c:\users\grott\anaconda3\lib\shutil.py", line 516, in rmtree
    return _rmtree_unsafe(path, onerror)
  File "c:\users\grott\anaconda3\lib\shutil.py", line 395, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
  File "c:\users\grott\anaconda3\lib\shutil.py", line 404, in _rmtree_unsafe
    onerror(os.rmdir, path, sys.exc_info())
  File "c:\users\grott\anaconda3\lib\shutil.py", line 402, in _rmtree_unsafe
    os.rmdir(path)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\Users\grott\AppData\Local\Temp\tmpkb27qid3\datasette'
```

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1479/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1884408624 I_kwDOBm6k_c5wUcsw 2177 Move schema tables from _internal to _catalog simonw 9599 open 0     1 2023-09-06T16:58:33Z 2023-09-06T17:04:30Z   OWNER  

This came up in discussion over: - https://github.com/simonw/datasette/pull/2174

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2177/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1875739055 I_kwDOBm6k_c5vzYGv 2167 Document return type of await ds.permission_allowed() simonw 9599 open 0     0 2023-08-31T15:14:23Z 2023-08-31T15:14:23Z   OWNER  

The return type isn't documented here: https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/docs/internals.rst#L327-L350

On inspecting the code I'm not 100% sure if it's possible for this method to return None, or if it can only return True or False. Need to confirm that.

https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/datasette/app.py#L822C15-L853

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2167/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1865869205 I_kwDOBm6k_c5vNueV 2157 Proposal: Make the `_internal` database persistent, customizable, and hidden asg017 15178711 open 0     3 2023-08-24T20:54:29Z 2023-08-31T02:45:56Z   CONTRIBUTOR  

The current _internal database is used by Datasette core to cache info about databases/tables/columns/foreign keys of databases in a Datasette instance. It's a temporary database created at startup, that can only be seen by the root user. See an example _internal DB here, after logging in as root.

The current _internal database has a few rough edges:

  • It's part of datasette.databases, so many plugins have to specifically exclude _internal from their queries (examples here)
  • It's only used by Datasette core and can't be used by plugins or 3rd parties
  • It's created from scratch at startup and stored in memory. Which is fine - the performance is great - but persistent storage would be nice.

Additionally, it would be really nice if plugins could use this _internal database to store their own configuration, secrets, and settings. For example:

  • datasette-auth-tokens creates a _datasette_auth_tokens table to store auth token metadata. This could be moved into the _internal database to avoid writing to the guest database
  • datasette-socrata creates a socrata_imports table, which also can be in _internal
  • datasette-upload-csvs creates a _csv_progress_ table, which can be in _internal
  • datasette-write-ui wants to have the ability for users to toggle whether a table appears editable, which can be either in datasette.yaml or on-the-fly by storing config in _internal

In general, these are specific features that Datasette plugins would have access to if there was a central internal database they could read/write to:

  • Dynamic configuration. Changing the datasette.yaml file works, but can be tedious to restart the server every time. Plugins can define their own configuration table in _internal, and could read/write to it to store configuration based on user actions (cell menu click, API access, etc.)
  • Caching. If a plugin or Datasette Core needs to cache some expensive computation, they can store it inside _internal (possibly as a temporary table) instead of managing their own caching solution.
  • Audit logs. If a plugin performs some sensitive operations, they can log usage info to _internal for others to audit later.
  • Long running process status. Many plugins (datasette-upload-csvs, datasette-litestream, datasette-socrata) perform tasks that run for a really long time, and want to give continuous status updates to the user. They can store this info inside _internal
  • Safer authentication. Passwords and authentication plugins usually store credentials/hashed secrets in configuration files or environment variables, which can be difficult to handle. Now, they can store them in _internal

Proposal

  • We remove _internal from datasette.databases property.
  • We add a new datasette.get_internal_db() method that returns the _internal database, for plugins to use - see the sketch after this list
  • We add a new --internal internal.db flag. If provided, then the _internal DB will be sourced from that file, and further updates will be persisted to that file (instead of an in-memory database)
  • When creating internal.db, create a new _datasette_internal table to mark it as a "datasette internal database"
  • In datasette serve, we check for the existence of the _datasette_internal table. If it exists, we assume the user provided that file in error and raise an error. This is to limit the chance that someone accidentally publishes their internal database to the internet. We could optionally add a --unsafe-allow-internal flag (or database plugin) that allows someone to do this if they really want to.
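
To make the proposal concrete, here is a sketch of how a plugin might use the proposed method (get_internal_db() and the --internal flag are proposals from this issue, not shipped APIs):

```python
from datasette import hookimpl

@hookimpl
def startup(datasette):
    async def create_table():
        # get_internal_db() is the method proposed above - hypothetical.
        internal_db = datasette.get_internal_db()
        # Create this plugin's own state table inside the internal database:
        await internal_db.execute_write(
            """
            CREATE TABLE IF NOT EXISTS datasette_my_plugin_state (
                key TEXT PRIMARY KEY,
                value TEXT
            )
            """
        )
    return create_table
```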

New features unlocked with this

These features don't really need a standardized _internal table per se (plugins could already implement their own long-term storage if they really wanted to), but it would make it much simpler to create these kinds of features with a persistent application database.

  • datasette-comments : A plugin for commenting on rows or specific values in a database. Comment contents + threads + email notification info can be stored in _internal
  • Bookmarks: "Bookmarking" an SQL query could be stored in _internal, or a URL link shortener
  • Webhooks: If a plugin wants to either consume a webhook or create a new one, they can store hashed credentials/API endpoints in _internal
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2157/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
594237015 MDU6SXNzdWU1OTQyMzcwMTU= 718 Plugin idea: datasette-redirects simonw 9599 open 0     0 2020-04-05T03:41:38Z 2023-08-30T22:17:31Z   OWNER  

I just had to write a one-off custom plugin to redirect niche-musems.com to www.niche-museums.com (https://github.com/simonw/museums/issues/21) - it would be great if this kind of thing could be handled by a configurable plugin.

https://github.com/simonw/museums/blob/6b1faf00c463b2228860d4d62d104b11935e01b1/plugins/redirect_www.py

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/718/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  reopened
1865649347 I_kwDOBm6k_c5vM4zD 2156 datasette -s/--setting option for setting nested configuration options simonw 9599 open 0     4 2023-08-24T18:09:27Z 2023-08-28T19:33:05Z   OWNER  

I've been thinking about what it might look like to allow command-line arguments to be used to define any of the configuration options in datasette.yml, as an alternative and more convenient syntax.

Here's what I've come up with:

```
datasette \
  -s settings.sql_time_limit_ms 1000 \
  -s plugins.datasette-auth-tokens.manage_tokens true \
  -s plugins.datasette-auth-tokens.manage_tokens_database tokens \
  mydatabase.db tokens.db
```

Which would be equivalent to datasette.yml containing this:

```yaml
plugins:
  datasette-auth-tokens:
    manage_tokens: true
    manage_tokens_database: tokens
settings:
  sql_time_limit_ms: 1000
```

More details in https://github.com/simonw/datasette/issues/2143#issuecomment-1690792514
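
A tiny sketch of the dotted-path expansion this syntax implies (illustrative only, not Datasette's actual parser):

```python
def set_nested(config: dict, dotted_key: str, value):
    # Turn "plugins.datasette-auth-tokens.manage_tokens" into nested dicts,
    # the transformation the -s/--setting option described above implies.
    keys = dotted_key.split(".")
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config

config = {}
set_nested(config, "settings.sql_time_limit_ms", 1000)
set_nested(config, "plugins.datasette-auth-tokens.manage_tokens", True)
print(config)
# {'settings': {'sql_time_limit_ms': 1000},
#  'plugins': {'datasette-auth-tokens': {'manage_tokens': True}}}
```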

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2156/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
814595021 MDU6SXNzdWU4MTQ1OTUwMjE= 1241 Share button for copying current URL Kabouik 7107523 open 0     6 2021-02-23T15:55:40Z 2023-08-24T20:09:52Z   NONE  

I use datasette in an iframe inside another HTML file that contains other ways to represent my data (mostly Leaflet maps built with R on summarized data), and the datasette iframe is a tab in that page.

This particular use prevents users from accessing the full URLs of their datasette views and queries, which is a shame, because the way datasette handles URLs to make every view or query easy to share is awesome. I know how to get the URL from the context menu of my browser, but I don't think many visitors would do it, or even notice that datasette uses permalinks for pretty much every action they do. Would it be possible to add a "Share link" button to the interface, either in datasette itself or in a plugin?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1241/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1855885427 I_kwDOBm6k_c5unpBz 2143 De-tangling Metadata before Datasette 1.0 asg017 15178711 open 0     24 2023-08-18T00:51:50Z 2023-08-24T18:28:27Z   CONTRIBUTOR  

Metadata in Datasette is a really powerful feature, but is a bit difficult to work with. It was initially a way to add "metadata" about your "data" in Datasette instances, like descriptions for databases/tables/columns, titles, source URLs, licenses, etc. But it later became the go-to spot for other Datasette features that have nothing to do with metadata, like permissions/plugins/canned queries.

Specifically, I've found the following problems when working with Datasette metadata:

  1. Metadata cannot be updated without re-starting the entire Datasette instance.
  2. The metadata.json/metadata.yaml has become a kitchen sink of unrelated (imo) features like plugin config, authentication config, canned queries
  3. The Python APIs for defining extra metadata are a bit awkward (the datasette.metadata() class, get_metadata() hook, etc.)

Possible solutions

Here's a few ideas of Datasette core changes we can make to address these problems.

Re-vamp the Datasette Python metadata APIs

The Datasette object has a single datasette.metadata() method that's a bit difficult to work with. There's also no Python API for inserting new metadata, so plugins have to rely on the get_metadata() hook.

The get_metadata() hook can also be improved - it doesn't work with async functions yet, so you're quite limited in what you can do.

(I'm a bit fuzzy on what to actually do here, but I imagine it'll be very small breaking changes to a few Python methods)

Add an optional datasette_metadata table

Datasette should detect and use metadata stored in a new special table called datasette_metadata. This would be a regular table that a user can edit on their own, and would serve as a "live updating" source of metadata, than can be changed while the Datasette instance is running.

Not too sure what the schema would look like, but I'd imagine:

```sql
CREATE TABLE datasette_metadata(
  level text,
  target any,
  key text,
  value any,
  primary key (level, target)
)
```

Every row in this table would map to a single metadata "entry".

  • level would be one of "datasette", "database", "table", "column", which is the "level" the entry describes. For example, level="table" means it is metadata about a specific table, level="database" for a specific database, or level="datasette" for the entire Datasette instance.
  • target would "point" to the specific object the entry metadata is about, and its format would depend on what the level is.
  • level="database": target would be the string name of the database that the metadata entry is about. ex "fixtures"
  • level="table": target would be a JSON array of two strings. The first element would be the database name, and the second would be the table name. ex ["fixtures", "students"]
  • level="column": target would be a JSON array of 3 strings: The database name, table name, and column name. Ex ["fixtures", "students", "student_id"]
  • key would be the type of metadata entry the row has, similar to the current "keys" that exist in metadata.json. Ex "about_url", "source", "description", etc
  • value would be the text value of the metadata entry. The literal text value of a description, about_url, column_label, etc

A quick sample:

level | target | key | value
-- | -- | -- | --
datasette | NULL | title | my datasette title...
db | fixtures | source | <description of my database source>
table | ["fixtures", "students"] | label_column | student_name
column | ["fixtures", "students", "birthdate"] | description | <description of the fixtures.students.birthdate column>

This datasette_metadata would be configured with other tools, and hopefully not manually by end users. Datasette Core could also offer a UI for editing entries in datasette_metadata, to update descriptions/columns on the fly.
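A runnable sketch of the proposed table, populated with the sample rows above (schema exactly as proposed; JSON-encoding the array-valued targets is an assumption about how they would be stored):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE datasette_metadata("
    "level text, target any, key text, value any, "
    "primary key (level, target))"
)
conn.executemany(
    "INSERT INTO datasette_metadata VALUES (?, ?, ?, ?)",
    [
        ("datasette", None, "title", "my datasette title..."),
        ("db", "fixtures", "source", "description of my database source"),
        ("table", json.dumps(["fixtures", "students"]), "label_column", "student_name"),
        ("column", json.dumps(["fixtures", "students", "birthdate"]), "description",
         "description of the birthdate column"),
    ],
)
# Look up the label column for fixtures.students:
(label_column,) = conn.execute(
    "SELECT value FROM datasette_metadata WHERE level = 'table' AND target = ?",
    (json.dumps(["fixtures", "students"]),),
).fetchone()
print(label_column)  # student_name
# note: with primary key (level, target), each target can hold only one entry;
# the sample data has one key per target, which is why this sketch works
```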

Re-vamp metadata.json and move non-metadata config to another place

The motivation behind this is that it's awkward that metadata.json contains config about things that are not strictly metadata, including:

  • Plugin configuration
  • Authentication/permissions (ex: the allow key on datasettes/databases/tables)
  • Canned queries. This might be controversial, but in my mind, canned queries are application-specific code and configuration, and don't describe the data that exists in SQLite databases.

I think we should move these outside of metadata.json and into a different file. The datasette.json idea in #2093 may be a good solution here: plugin/permissions/canned queries can be defined in datasette.json, while metadata.json/datasette_metadata will strictly be about documenting databases/tables/columns.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2143/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1858228057 I_kwDOBm6k_c5uwk9Z 2147 Plugin hook for database queries that are run jackowayed 18899 open 0     6 2023-08-20T18:43:50Z 2023-08-24T03:54:35Z   NONE  

I'm interested in making a plugin that saves every query that gets run to a table in the database. (I know about datasette-query-history but thought it would be good to have a server-side option.)

As far as I can tell from reading the docs, there isn't really a hook set up to allow this.

Maybe I could hack it with some of the hooks that are passed requests, but that doesn't seem good.

I'm a little surprised this isn't possible, so I thought I would open an issue and see if that's a deeply considered decision or just "haven't needed it yet." I'm potentially interested in implementing the hook if the latter.
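For concreteness, a sketch of what a plugin using such a hook might look like - the hook name and signature here are invented, since no such hook exists in Datasette today:

```python
from datasette import hookimpl

# Hypothetical hook - neither the name nor the signature exists in Datasette;
# this is only what a server-side query logger might look like if it did.
@hookimpl
def query_executed(datasette, database, sql, params):
    async def inner():
        db = datasette.get_database("query_log")
        await db.execute_write(
            "insert into queries (database, sql, params) values (?, ?, ?)",
            [database, sql, str(params)],
        )

    return inner  # assumption: Datasette would await this after running the query
```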

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2147/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
459509126 MDU6SXNzdWU0NTk1MDkxMjY= 516 Enforce import sort order with isort simonw 9599 open 0     8 2019-06-22T20:35:50Z 2023-08-23T02:15:36Z   OWNER  

I want to use isort to order imports. A few steps here:

  • [x] Add a .isort.cfg file (see below)
  • [x] Use isort -rc to reformat existing code
  • [ ] Commit this change
  • [x] Add a unit test that ensures future changes remain isort compatible
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/516/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1857234285 I_kwDOBm6k_c5usyVt 2145 If a row has a primary key of `null` various things break simonw 9599 open 0     23 2023-08-18T20:06:28Z 2023-08-21T17:30:01Z   OWNER  

Stumbled across this while experimenting with datasette-write-ui. The error I got was a 500 on the /db page:

'NoneType' object has no attribute 'encode'

Tracked it down to this code, which assembles the URL for a row page:

https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L120-L134

That's because tilde_encode can't handle None: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L1175-L1178
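A minimal illustration of the failure mode, using a simplified stand-in for tilde_encode (the real one fails with the 'NoneType' object has no attribute 'encode' error above):

```python
# Simplified stand-in for datasette.utils.tilde_encode - illustration only,
# not the actual implementation.
def tilde_encode_simplified(value: str) -> str:
    return "".join(
        ch if ch.isalnum() or ch in "-_" else "~{:02X}".format(ord(ch))
        for ch in value
    )

print(tilde_encode_simplified("foo/bar.csv"))  # foo~2Fbar~2Ecsv

row_pks = ["1", None]  # a row whose primary key value is null
# "/db/table/" + ",".join(tilde_encode_simplified(pk) for pk in row_pks)
# raises TypeError here (AttributeError in the real tilde_encode),
# which is what surfaces as the 500 on the /db page
```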

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2145/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
476852861 MDU6SXNzdWU0NzY4NTI4NjE= 568 Add database_color as a configurable option LBHELewis 50906992 open 0     1 2019-08-05T13:14:45Z 2023-08-11T05:19:42Z   NONE  

This would be really useful as it would allow us to tie in with colour schemes.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/568/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1838469176 I_kwDOBm6k_c5tlNA4 2127 Context base class to support documenting the context simonw 9599 open 0   Datasette 1.0 3268330 3 2023-08-07T00:01:02Z 2023-08-10T01:30:25Z   OWNER  

This idea first came up here: - https://github.com/simonw/datasette/issues/2112#issuecomment-1652751140

If datasette.render_template(...) takes an optional Context subclass as an alternative to a context dictionary, I could then use dataclasses to define the context made available to specific templates - which then gives me something I can use to help document what they are.
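A sketch of the idea - the Context base class and the field names here are placeholders, not an existing Datasette API:

```python
from dataclasses import asdict, dataclass


@dataclass
class Context:
    """Placeholder base class - the real one would live in Datasette core."""


@dataclass
class DatabaseContext(Context):
    # Each field documents one template variable made available to database.html
    database: str           # name of the database being rendered
    hidden_count: int       # number of hidden tables
    allow_execute_sql: bool


# datasette.render_template() could accept either a dict or a Context:
context = DatabaseContext(database="fixtures", hidden_count=2, allow_execute_sql=True)
template_vars = asdict(context)  # falls back to today's dict-based rendering
```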

Also refs: - https://github.com/simonw/datasette/issues/1510

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2127/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1843821954 I_kwDOBm6k_c5t5n2C 2137 Redesign row default JSON simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2023-08-09T18:49:11Z 2023-08-09T19:02:47Z   OWNER  

This URL here:

https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables

```json
{
  "database": "fixtures",
  "table": "simple_primary_key",
  "rows": [
    {
      "id": "1",
      "content": "hello"
    }
  ],
  "columns": ["id", "content"],
  "primary_keys": ["id"],
  "primary_key_values": ["1"],
  "units": {},
  "foreign_key_tables": [
    {
      "other_table": "foreign_key_references",
      "column": "id",
      "other_column": "foreign_key_with_blank_label",
      "count": 0,
      "link": "/fixtures/foreign_key_references?foreign_key_with_blank_label=1"
    },
    {
      "other_table": "foreign_key_references",
      "column": "id",
      "other_column": "foreign_key_with_label",
      "count": 1,
      "link": "/fixtures/foreign_key_references?foreign_key_with_label=1"
    },
    {
      "other_table": "complex_foreign_keys",
      "column": "id",
      "other_column": "f3",
      "count": 1,
      "link": "/fixtures/complex_foreign_keys?f3=1"
    },
    {
      "other_table": "complex_foreign_keys",
      "column": "id",
      "other_column": "f2",
      "count": 0,
      "link": "/fixtures/complex_foreign_keys?f2=1"
    },
    {
      "other_table": "complex_foreign_keys",
      "column": "id",
      "other_column": "f1",
      "count": 1,
      "link": "/fixtures/complex_foreign_keys?f1=1"
    }
  ],
  "query_ms": 4.226590999678592,
  "source": "tests/fixtures.py",
  "source_url": "https://github.com/simonw/datasette/blob/main/tests/fixtures.py",
  "license": "Apache License 2.0",
  "license_url": "https://github.com/simonw/datasette/blob/main/LICENSE",
  "ok": true,
  "truncated": false
}
```

That ?_extras= should be ?_extra= - plus the row JSON should be redesigned to fit the new default JSON representation.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2137/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1822939274 I_kwDOBm6k_c5sp9iK 2113 Implement and document extras for the new query view page simonw 9599 open 0   Datasette 1.0a-next 8755003 3 2023-07-26T18:24:01Z 2023-08-09T17:35:22Z   OWNER  
  • #2109

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2113/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1840417903 I_kwDOBm6k_c5tsoxv 2131 Refactor code that supports templates_considered comment simonw 9599 open 0   Datasette 1.0 3268330 1 2023-08-08T01:28:36Z 2023-08-09T15:27:41Z   OWNER  

I ended up duplicating it here: https://github.com/simonw/datasette/blob/7532feb424b1dce614351e21b2265c04f9669fe2/datasette/views/database.py#L164-L167

I think it should move to datasette.render_template() - and maybe have a renamed template variable too.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2131/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1840324765 I_kwDOBm6k_c5tsSCd 2129 CSV ?sql= should indicate errors simonw 9599 open 0   Datasette 1.0 3268330 1 2023-08-07T23:13:04Z 2023-08-08T02:02:21Z   OWNER  

https://latest.datasette.io/_memory.csv?sql=select+blah is a blank page right now:

```bash
curl -I 'https://latest.datasette.io/_memory.csv?sql=select+blah'
```
```
HTTP/2 200
access-control-allow-origin: *
access-control-allow-headers: Authorization, Content-Type
access-control-expose-headers: Link
access-control-allow-methods: GET, POST, HEAD, OPTIONS
access-control-max-age: 3600
content-type: text/plain; charset=utf-8
x-databases: _memory, _internal, fixtures, fixtures2, extra_database, ephemeral
date: Mon, 07 Aug 2023 23:12:15 GMT
server: Google Frontend
```

Originally posted by @simonw in https://github.com/simonw/datasette/issues/2118#issuecomment-1668688947

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2129/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1824457306 I_kwDOBm6k_c5svwJa 2122 Parameters on canned queries: fixed or query-generated list? meowcat 1563881 open 0     0 2023-07-27T14:07:07Z 2023-07-27T14:07:07Z   NONE  

Hi,

currently parameters in canned queries are just text fields. It would be cool to have one of the options below. Would you accept a PR doing something in this direction? (Possibly this could even work as a plugin.)

  • adding facets, which would work like facets on tables or views, giving a list of selectable options (and leaving parameters as is)
  • making it possible to provide a query which returns selectable values for a parameter, e.g.

```yaml
calendar_entries_current_instrument:
  sql: |
    select * from calendar_entries
    where
      DTEND_UNIX > UNIXEPOCH() and
      DTSTART_UNIX < UNIXEPOCH() + :days * 24 * 60 * 60 and
      current = 1 and
      MACHINE = :instrument
    order by DTSTART_UNIX
  params:
    days:
      sql: "SELECT VALUE FROM generate_series(1, 30, 1)"  # this obviously requires the corresponding sqlite extension
    instrument:
      sql: "SELECT DISTINCT MACHINE FROM calendar_entries"
```
  • making it possible to provide a fixed list of parameters

```yaml
calendar_entries_current_instrument:
  sql: |
    select * from calendar_entries
    where
      DTEND_UNIX > UNIXEPOCH() and
      DTSTART_UNIX < UNIXEPOCH() + :days * 24 * 60 * 60 and
      current = 1 and
      MACHINE = :instrument
    order by DTSTART_UNIX
  params:
    days:
      values: [1, 2, 3, 5, 10, 20, 30]
    instrument:
      values: [supermachine, crappymachine, boringmachine]
```
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2122/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1823428714 I_kwDOBm6k_c5sr1Bq 2120 Add __all__ to datasette/__init__.py simonw 9599 open 0     0 2023-07-27T01:07:10Z 2023-07-27T01:07:10Z   OWNER  

Currently looks like this: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/__init__.py#L1-L6

Adding __all__ = ["Permission", "Forbidden"...] would let me get rid of those # noqa comments.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2120/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1822813627 I_kwDOBm6k_c5spe27 2108 some (many?) SQL syntax errors are not throwing errors with a .csv endpoint fgregg 536941 open 0     0 2023-07-26T16:57:45Z 2023-07-26T16:58:07Z   CONTRIBUTOR  

here's a CTE query that should always fail with a syntax error:

```sql
with foo as (nonsense) select * from foo;
```

when we make this query against the default endpoint, we do indeed get a 400 status code and the problem is returned to the user: https://global-power-plants.datasettes.com/global-power-plants?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B

but, if we use the csv endpoint, we get a 200 status code and no indication of a problem: https://global-power-plants.datasettes.com/global-power-plants.csv?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B

same with this bad sql

```sql
select a, from foo;
```

https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B

vs

https://global-power-plants.datasettes.com/global-power-plants.csv?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B

but, datasette catches this bad sql at both endpoints:

```sql
slect a from foo;
```

https://global-power-plants.datasettes.com/global-power-plants?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B https://global-power-plants.datasettes.com/global-power-plants.csv?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2108/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1811824307 I_kwDOBm6k_c5r_j6z 2105 When reverse proxying datasette with nginx an URL element gets erronously added aki-k 2235371 open 0     3 2023-07-19T12:16:53Z 2023-07-21T21:17:09Z   NONE  

I use this nginx config:

```
location /datasette-llm { return 302 /datasette-llm/; }

location /datasette-llm/ {
  proxy_set_header Upgrade           $http_upgrade;
  proxy_set_header Connection        "Upgrade";
  proxy_http_version 1.1;
  proxy_set_header X-Real-IP         $remote_addr;
  proxy_set_header X-Forwarded-For   $proxy_add_x_forwarded_for;
  proxy_set_header X-Forwarded-Proto https;
  proxy_set_header X-Forwarded-Host  $http_host;
  proxy_set_header Host              $host;
  proxy_max_temp_file_size           0;
  proxy_pass                         http://127.0.0.1:8001/datasette-llm/;
  proxy_redirect                     http:// https://;
  proxy_buffering off;
  proxy_request_buffering off;
  proxy_set_header Origin            '';
  client_max_body_size 0;
  auth_basic                         "datasette-llm";
  auth_basic_user_file               /etc/nginx/custom-userdb;
}

```

Then I start datasette with this command:

```
datasette serve --setting base_url /datasette-llm/ $(llm logs path)
```

Everything else works right, except the links in "This data as json, CSV". They get an extra URL element "datasette-llm", like this:

https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.json?sql=select+*+from+_llm_migrations

https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.csv?sql=select+*+from+_llm_migrations&_size=max

When I remove that extra "datasette-llm" from the URL, those links work too.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2105/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1808215339 I_kwDOBm6k_c5rxy0r 2104 Tables starting with an underscore should be treated as hidden simonw 9599 open 0     2 2023-07-17T17:13:53Z 2023-07-18T22:41:37Z   OWNER  

Plugins can then take advantage of this pattern, for example: - https://github.com/simonw/datasette-auth-tokens/pull/8

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2104/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1808116827 I_kwDOBm6k_c5rxaxb 2103 data attribute on Datasette tables exposing the primary key of the row simonw 9599 open 0     0 2023-07-17T16:18:25Z 2023-07-17T16:18:25Z   OWNER  

Maybe put it on the <tr> but probably better to go on the td.type-pk.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2103/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1765870617 I_kwDOBm6k_c5pQQwZ 2087 `--settings settings.json` option simonw 9599 open 0     2 2023-06-20T17:48:45Z 2023-07-14T17:02:03Z   OWNER  

https://discord.com/channels/823971286308356157/823971286941302908/1120705940728066080

May I add a request to the whole metadata / settings? Allow passing --settings path/to/settings.json instead of having to rely exclusively on directory mode to centralize settings (this would reflect the behavior of providing metadata)

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2087/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1803264272 I_kwDOBm6k_c5re6EQ 2101 alter: true support for JSON write API simonw 9599 open 0     1 2023-07-13T15:24:11Z 2023-07-13T15:24:18Z   OWNER  

Requested here: https://discord.com/channels/823971286308356157/823971286941302908/1129034187073134642

The former datasette-insert plugin had an option ?alter=1 to auto-add new columns. Does the JSON write API also have this?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2101/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1794097871 I_kwDOBm6k_c5q78LP 2095 Introduce "dark mode" CSS jamietanna 3315059 open 0     0 2023-07-07T19:15:58Z 2023-07-07T19:15:58Z   NONE  

Using the CSS media query prefers-color-scheme, we can provide a dark-mode version of Datasette.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2095/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1783304750 I_kwDOBm6k_c5qSxIu 2094 JS Plugin Hooks for the Code Editor asg017 15178711 open 0     0 2023-07-01T00:51:57Z 2023-07-01T00:51:57Z   CONTRIBUTOR  

When #2052 merges, I'd like to add support for adding extensions/functions to the Datasette code editor.

I'd eventually like to build a JS plugin for sqlite-docs, to add things like:

  • Inline documentation for tables/columns on hover
  • Inline docs for custom functions that are loaded in
  • More detailed autocomplete for tables/columns/functions

I did some hacking to see what this would look like, see here:

There can be a new hook that allows JS plugins to add a new "extension" to the CodeMirror EditorView here:

https://github.com/simonw/datasette/blob/8cd60fd1d899952f1153460469b3175465f33f80/datasette/static/cm-editor-6.0.1.js#L25

Will need some more planning. For example, the Codemirror bundle in Datasette has functions that we could re-export for plugins to use (so we don't load two versions of "@codemirror/autocomplete", for example).

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2094/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 1,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1781005740 I_kwDOBm6k_c5qJ_2s 2090 Adopt ruff for linting simonw 9599 open 0     2 2023-06-29T14:56:43Z 2023-06-29T15:05:04Z   OWNER  

https://beta.ruff.rs/docs/

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2090/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1054244712 I_kwDOBm6k_c4-1n9o 1510 Datasette 1.0 documented template context (maybe via API docs) simonw 9599 open 0   Datasette 1.0 3268330 3 2021-11-15T23:23:58Z 2023-06-28T02:05:21Z   OWNER  

Documented context plus protective unit tests. Goal is that custom templates built for 1.x will not break without a 2.x release.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1510/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
323223872 MDU6SXNzdWUzMjMyMjM4NzI= 260 Validate metadata.json on startup simonw 9599 open 0     7 2018-05-15T13:42:56Z 2023-06-21T12:51:22Z   OWNER  

It's easy to misspell the name of a database or table and then be puzzled when the metadata settings silently fail.

To avoid this, let's sanity check the provided metadata.json on startup and quit with a useful error message if we find any obvious mistakes.
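A rough sketch of the kind of startup check this describes (standalone, not Datasette code; the function and its arguments are illustrative):

```python
import json
import sqlite3
import sys


def validate_metadata(metadata_path, databases):
    """databases: mapping of database name -> path to its SQLite file."""
    with open(metadata_path) as fp:
        metadata = json.load(fp)
    for db_name, db_meta in metadata.get("databases", {}).items():
        if db_name not in databases:
            sys.exit(f"metadata.json references unknown database: {db_name!r}")
        conn = sqlite3.connect(databases[db_name])
        tables = {
            row[0]
            for row in conn.execute(
                "select name from sqlite_master where type in ('table', 'view')"
            )
        }
        for table_name in db_meta.get("tables", {}):
            if table_name not in tables:
                sys.exit(
                    f"metadata.json references unknown table: {db_name}/{table_name}"
                )
```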

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/260/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1764792125 I_kwDOBm6k_c5pMJc9 2086 Show information on startup in directory configuration mode simonw 9599 open 0     0 2023-06-20T07:13:33Z 2023-06-20T07:13:33Z   OWNER  

https://discord.com/channels/823971286308356157/823971286941302908/1120516587036889098

One thing that would be helpful would be a message at launch indicating a metadata.json is getting picked up. I'm using directory mode and was editing the wrong file for a while before I realized nothing I was doing was having any effect.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2086/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1762180409 I_kwDOBm6k_c5pCL05 2085 Interactive row selection in Datasette learning4life 24938923 open 0     0 2023-06-18T08:29:45Z 2023-06-18T08:31:23Z   NONE  

Simon did an excellent prototype of interactive row selection in Datasette.

I hope this functionality can be turned into a Datasette plugin.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2085/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1761613778 I_kwDOBm6k_c5pABfS 2084 Support facets for columns that contain timestamps devxpy 19492893 open 0     0 2023-06-17T03:33:54Z 2023-06-17T03:33:54Z   NONE  

Django has this very nice filter for datetime fields -

It would be nice to have something similar in datasette too, to facet by a field that contains a timestamp - faceting doesn't seem to do anything special with timestamps right now...
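In the meantime, the effect can be approximated in SQL by grouping on date() - a sketch assuming a hypothetical events table with ISO-format timestamps in a created column:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table events (id integer primary key, created text)")
conn.executemany(
    "insert into events (created) values (?)",
    [("2023-06-16T10:00:00",), ("2023-06-16T18:30:00",), ("2023-06-17T09:15:00",)],
)
# Bucket rows by day - the shape a date facet would need:
for day, count in conn.execute(
    "select date(created) as day, count(*) as n from events group by day order by day"
):
    print(day, count)
# 2023-06-16 2
# 2023-06-17 1
```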

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2084/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1727478903 I_kwDOBm6k_c5m9zx3 2081 Update Endpoints defined in metadata throws 403 Forbidden after a while cutmasta-kun 15085007 open 0     0 2023-05-26T11:52:30Z 2023-05-26T11:52:30Z   NONE  

Hello. I expose an endpoint to update tasks:

```json
{
  "title": "My Datasette Instance",
  "databases": {
    "tasks": {
      "queries": {
        "update_task": {
          "sql": "UPDATE tasks SET status = :status, result = :result, systemMessage = :systemMessage WHERE queueID = :queueID",
          "write": true,
          "on_success_message": "Task updated",
          "on_success_redirect": "/tasks/tasks.json",
          "on_error_message": "Task update failed",
          "on_error_redirect": "/tasks.json",
          "params": ["queueID", "taskData", "status", "result", "systemMessage"]
        }
      }
    }
  }
}
```

This works really well! But after a while, the Datasette instance answers with 403 Forbidden. I have to delete the database and recreate it in order for it to work again.

Any help here? (´。_。`)

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2081/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1708030220 I_kwDOBm6k_c5lznkM 2073 Faceting doesn't work against integer columns in views simonw 9599 open 0     2 2023-05-12T18:20:10Z 2023-05-12T18:24:07Z   OWNER  

Spotted this issue here: https://til.simonwillison.net/datasette/baseline

I had to do this workaround:

```sql
create view baseline as select
  _key,
  spec,
  '' || json_extract(status, '$.is_baseline') as is_baseline,
  json_extract(status, '$.since') as baseline_since,
  json_extract(status, '$.support.chrome') as baseline_chrome,
  json_extract(status, '$.support.edge') as baseline_edge,
  json_extract(status, '$.support.firefox') as baseline_firefox,
  json_extract(status, '$.support.safari') as baseline_safari,
  compat_features,
  caniuse,
  usage_stats,
  status
from [index]
```

I think the core issue here is that, against a table, select * from x where integer_column = '1' works correctly, due to some kind of column type conversion mechanism... but this mechanism doesn't work against views.
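The difference can be demonstrated directly with SQLite: a declared integer column on a table applies numeric affinity to the text '1' before comparing, while an expression column in a view (like the json_extract() calls above) has no declared type, so no coercion happens. A quick sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table t (n integer)")
conn.execute("insert into t values (1)")
# n_expr is an expression column, so it carries no declared type affinity
conn.execute("create view v as select n, n + 0 as n_expr from t")

print(conn.execute("select count(*) from t where n = '1'").fetchone()[0])       # 1
print(conn.execute("select count(*) from v where n_expr = '1'").fetchone()[0])  # 0
```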

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2073/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1698865182 I_kwDOBm6k_c5lQqAe 2069 [BUG] Cannot insert new data to deployed instance yqlbu 31861128 open 0     1 2023-05-07T02:59:42Z 2023-05-07T03:17:35Z   NONE  

Summary

Recently, I deployed an instance of datasette to Vercel with the following plugins:

  • datasette-auth-tokens
  • datasette-insert

With the above plugins, I was able to insert new data into a local sqlite db. However, when it comes to the deployment on Vercel, things behave differently. I observed some errors from the logs console on Vercel:

```console
File "/var/task/datasette/database.py", line 179, in _execute_writes
  conn = self.connect(write=True)
File "/var/task/datasette/database.py", line 93, in connect
  assert not (write and not self.is_mutable)
AssertionError
```

I think it is a potential bug.

Reproduce

metadata.json
```json
{
  "plugins": {
    "datasette-insert": {
      "allow": {
        "id": "*"
      }
    },
    "datasette-auth-tokens": {
      "tokens": [
        {
          "token": {
            "$env": "INSERT_TOKEN"
          },
          "actor": {
            "id": "repeater"
          }
        }
      ],
      "param": "_auth_token"
    }
  }
}
```
commands
```bash
# deploy
datasette publish vercel remote.db \
  --project=repeater-bot-sqlite \
  --metadata metadata.json \
  --install datasette-auth-tokens \
  --install datasette-insert \
  --vercel-json=vercel.json

# test insert
cat fixtures/dogs.json | curl --request POST -d @- -H "Authorization: Bearer <token>" \
  'https://repeater-bot-sqlite.vercel.app/-/insert/remote/dogs?pk=id'
```
logs
```console
Traceback (most recent call last):
  File "/var/task/datasette/app.py", line 1354, in route_path
    response = await view(request, send)
  File "/var/task/datasette/app.py", line 1500, in async_view_fn
    response = await async_call_with_supported_arguments(
  File "/var/task/datasette/utils/__init__.py", line 1005, in async_call_with_supported_arguments
    return await fn(*call_with)
  File "/var/task/datasette_insert/__init__.py", line 14, in insert_or_upsert
    response = await insert_or_upsert_implementation(request, datasette)
  File "/var/task/datasette_insert/__init__.py", line 91, in insert_or_upsert_implementation
    table_count = await db.execute_write_fn(write_in_thread, block=True)
  File "/var/task/datasette/database.py", line 167, in execute_write_fn
    raise result
  File "/var/task/datasette/database.py", line 179, in _execute_writes
    conn = self.connect(write=True)
  File "/var/task/datasette/database.py", line 93, in connect
    assert not (write and not self.is_mutable)
AssertionError
```
datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2069/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1690765434 I_kwDOBm6k_c5kxwh6 2067 Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6 justmars 39538958 open 0     1 2023-05-01T12:42:28Z 2023-05-03T00:16:03Z   NONE  

Hi! Wondering if this issue is limited to my local system or if it affects others as well.

It seems like 3.11 errors out on a "litestream-restored" database. On further investigation, it also appears to conk out on 3.10.8 but works on 3.10.7 and 3.10.6.

To demo the issue, I created a test database, replicated it to an aws s3 bucket, then restored it under various .pyenv-versioned shells where I test whether I can read the database via the sqlite3 cli.

```sh
# create new shell with 3.11.3
litestream restore -o data/db.sqlite s3://mytestbucketxx/db
sqlite3 data/db.sqlite
SQLite version 3.41.2 2023-03-22 11:56:21
Enter ".help" for usage hints.
sqlite> .tables
_litestream_lock  _litestream_seq   movie
sqlite>
```

However this gets me an OperationalError when reading via datasette:

Error on 3.11.3 and 3.10.8

```sh
datasette data/db.sqlite
```

```console
/tester/.venv/lib/python3.11/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API
  warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)
Traceback (most recent call last):
  File "/tester/.venv/bin/datasette", line 8, in <module>
    sys.exit(cli())
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tester/.venv/lib/python3.11/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/cli.py", line 143, in wrapped
    return fn(*args, **kwargs)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/cli.py", line 615, in serve
    asyncio.get_event_loop().run_until_complete(check_databases(ds))
  File "/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
  File "/tester/.venv/lib/python3.11/site-packages/datasette/cli.py", line 660, in check_databases
    await database.execute_fn(check_connection)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/database.py", line 213, in execute_fn
    return await asyncio.get_event_loop().run_in_executor(
  File "/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/database.py", line 211, in in_thread
    return fn(conn)
  File "/tester/.venv/lib/python3.11/site-packages/datasette/utils/__init__.py", line 951, in check_connection
    for r in conn.execute(
sqlite3.OperationalError: unable to open database file
```
Works on 3.10.7, 3.10.6

```sh
# create new shell with 3.10.7 / 3.10.6
litestream restore -o data/db.sqlite s3://mytestbucketxx/db
datasette data/db.sqlite
# ...
# INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
```

In both scenarios, the only dependencies were the pinned python version and the latest Datasette version 0.64.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2067/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1663399821 I_kwDOBm6k_c5jJXeN 2058 500 "attempt to write a readonly database" error caused by "PRAGMA schema_version" simonw 9599 open 0     9 2023-04-11T23:57:50Z 2023-04-13T16:35:21Z   OWNER  

I've not been able to replicate this myself yet, but I've seen log files from a user affected by it.

```
File "/usr/local/lib/python3.11/site-packages/datasette/views/base.py", line 89, in dispatch_request
  await self.ds.refresh_schemas()
File "/usr/local/lib/python3.11/site-packages/datasette/app.py", line 371, in refresh_schemas
  await self._refresh_schemas()
File "/usr/local/lib/python3.11/site-packages/datasette/app.py", line 386, in _refresh_schemas
  schema_version = (await db.execute("PRAGMA schema_version")).first()[0]
File "/usr/local/lib/python3.11/site-packages/datasette/database.py", line 267, in execute
  results = await self.execute_fn(sql_operation_in_thread)
File "/usr/local/lib/python3.11/site-packages/datasette/database.py", line 213, in execute_fn
  return await asyncio.get_event_loop().run_in_executor(
File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
  result = self.fn(*self.args, **self.kwargs)
File "/usr/local/lib/python3.11/site-packages/datasette/database.py", line 211, in in_thread
  return fn(conn)
File "/usr/local/lib/python3.11/site-packages/datasette/database.py", line 237, in sql_operation_in_thread
  cursor.execute(sql, params if params is not None else {})
sqlite3.OperationalError: attempt to write a readonly database
```

That's running the official Datasette Docker image on https://fly.io/ - it's causing 500 errors on every page of their site.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2058/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1665510265 I_kwDOBm6k_c5jRat5 2060 Clean up a bunch of warnings from ruff simonw 9599 open 0     0 2023-04-13T01:23:02Z 2023-04-13T01:23:02Z   OWNER  

See: - #2056

ruff spots a bunch of warnings about things like unused variables - would be good to clean up as many of these as possible.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2060/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1665053646 I_kwDOBm6k_c5jPrPO 2059 "Deceptive site ahead" alert on Heroku deployment mtdukes 1186275 open 0     1 2023-04-12T18:34:51Z 2023-04-13T01:13:01Z   NONE  

I deployed a fairly basic instance of Datasette (datasette-auth-passwords is the only plugin) using Heroku. The deployed URL now gives a "Deceptive site ahead" warning to users.

Is there a way around this? Maybe a way to add ownership verification through Google's Search Console?

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2059/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1657861026 I_kwDOBm6k_c5i0POi 2054 Make detailed notes on how table, query and row views work right now simonw 9599 open 0     13 2023-04-06T18:21:09Z 2023-04-07T20:14:38Z   OWNER  

Research to help influence the following: - #2049 - #2053 - #2050 - #262

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2054/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1646734246 I_kwDOBm6k_c5iJyum 2049 Custom SQL queries should use new JSON ?_extra= format simonw 9599 open 0   Datasette 1.0a-next 8755003 4 2023-03-30T00:42:53Z 2023-04-05T23:29:27Z   OWNER  

Related: - #262

I've made the change to the table view, now I need the new format to work for arbitrary SQL queries too.

Note that this incorporates both arbitrary SQL queries and canned queries.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2049/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1649791661 I_kwDOBm6k_c5iVdKt 2050 Row page JSON should use new ?_extra= format simonw 9599 open 0   Datasette 1.0a-next 8755003 1 2023-03-31T17:56:53Z 2023-03-31T17:59:49Z   OWNER  

https://latest.datasette.io/fixtures/facetable/2.json

Related: - #2049 - #1709

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2050/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1649793525 I_kwDOBm6k_c5iVdn1 2051 `?_extra=row_urls` for table pages simonw 9599 open 0     0 2023-03-31T17:58:36Z 2023-03-31T17:58:36Z   OWNER  

Provides URLs to the JSON version of those rows. Maybe it persists the ?_shape= option too? Not sure about that.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2051/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1615692818 I_kwDOBm6k_c5gTYQS 2035 Potential feature: special support for `?a=1&a=2` on the query page simonw 9599 open 0   Datasette 1.0 3268330 14 2023-03-08T18:05:03Z 2023-03-31T16:09:08Z   OWNER  

From a discussion on Discord: https://discord.com/channels/823971286308356157/996877076982415491/1082789517062320138

The key idea is to make it easier for people to implement where id in (...) that's populated from query string arguments.

What if you could add ?id=11&id=32&id=62 to the URL and have that made available as a list that can be used in the query?
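A sketch of the mechanics (names are illustrative - this is not a Datasette API): repeated parameters parse into a list, which can drive a parameterized in (...) clause:

```python
from urllib.parse import parse_qs

query_string = "id=11&id=32&id=62"
ids = parse_qs(query_string)["id"]  # ['11', '32', '62']

placeholders = ", ".join("?" for _ in ids)
sql = f"select * from mytable where id in ({placeholders})"
print(sql)  # select * from mytable where id in (?, ?, ?)
# then: conn.execute(sql, ids)
```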

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2035/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1531991339 I_kwDOBm6k_c5bUFUr 1989 Suggestion: Hiding columns pax 116795 open 0     3 2023-01-13T09:33:32Z 2023-03-31T06:18:05Z   NONE  

As there's the possibility of hiding tables, I've run into the need to hide specific columns - data that's either not relevant for the public or can't be shown due to privacy reasons.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1989/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1646068413 I_kwDOBm6k_c5iHQK9 2048 Test failures encountered while packaging for GNU Guix Apteryks 8332263 open 0     0 2023-03-29T15:36:54Z 2023-03-29T15:36:54Z   NONE  

Hello,

While reviewing a package submitted to Guix to add datasette, the test suite produces the following errors:

```
=================================== FAILURES ===================================
______________________ test_row_strange_table_name _____________________________
[gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffef099be0>

def test_row_strange_table_name(app_client):
    response = app_client.get(
        "/fixtures/table~2Fwith~2Fslashes~2Ecsv/3.json?_shape=objects"
    )
  assert response.status == 200

E       assert 400 == 200
E        +  where 400 = <datasette.utils.testing.TestResponse object at 0x7fffef236a30>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:701: AssertionError
----------------------------- Captured stderr call -----------------------------
ERROR: conn=<sqlite3.Connection object at 0x7fffeedfe5d0>, sql = 'select rowid, * from [table%7E2Fwith%7E2Fslashes%7E2Ecsv] where "rowid"=:p0', params = {'p0': '3'}: no such table: table%7E2Fwith%7E2Fslashes%7E2Ecsv
______________ test_database_page_for_database_with_dot_in_name ________________
[gw15] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_with_dot = <datasette.utils.testing.TestClient object at 0x7fffef3416a0>

def test_database_page_for_database_with_dot_in_name(app_client_with_dot):
    response = app_client_with_dot.get("/fixtures~2Edot.json")
  assert response.status == 200

E       assert 302 == 200
E        +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffef089fa0>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:633: AssertionError
__________________ test_tilde_encoded_database_names[fo%o] _____________________
[gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

db_name = 'fo%o'

@pytest.mark.asyncio
@pytest.mark.parametrize("db_name", ("foo", r"fo%o", "f~/c.d"))
async def test_tilde_encoded_database_names(db_name):
    ds = Datasette()
    ds.add_memory_database(db_name)
    response = await ds.client.get("/.json")
    assert db_name in response.json().keys()
    path = response.json()[db_name]["path"]
    # And the JSON for that database
    response2 = await ds.client.get(path + ".json")
  assert response2.status_code == 200

E       assert 302 == 200
E        +  where 302 = <Response [302 Found]>.status_code

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError
_________________ test_tilde_encoded_database_names[f~/c.d] ____________________
[gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

db_name = 'f~/c.d'

@pytest.mark.asyncio
@pytest.mark.parametrize("db_name", ("foo", r"fo%o", "f~/c.d"))
async def test_tilde_encoded_database_names(db_name):
    ds = Datasette()
    ds.add_memory_database(db_name)
    response = await ds.client.get("/.json")
    assert db_name in response.json().keys()
    path = response.json()[db_name]["path"]
    # And the JSON for that database
    response2 = await ds.client.get(path + ".json")
  assert response2.status_code == 200

E       assert 302 == 200
E        +  where 302 = <Response [302 Found]>.status_code

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError
______________ test_database_with_space_in_name[/searchable.json] ______________
[gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef11d730> path = '/searchable.json'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(


self = <httpx.AsyncClient object at 0x7fffef11d940> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E2Ejson')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
__________________ test_database_with_space_in_name[.json] _____________________
[gw19] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef085a90> path = '.json'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(


self = <httpx.AsyncClient object at 0x7fffecd99ca0> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E2Ejson')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
______________ test_database_with_space_in_name[/searchable_view] ______________
[gw22] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffeeab4c70> path = '/searchable_view'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(


self = <httpx.AsyncClient object at 0x7fffec5b3580> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable_view')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
___________________ test_database_with_space_in_name[/] ________________________
[gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef085be0> path = '/'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(


self = <httpx.AsyncClient object at 0x7fffec5f1370> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
____________ test_database_with_space_in_name[/searchable] _____________
[gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef099f10> path = '/searchable'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(


self = <httpx.AsyncClient object at 0x7fffecd8c790> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
________ test_database_with_space_in_name[/searchable_view.json] ________
[gw23] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef341520> path = '/searchable_view.json'

@pytest.mark.parametrize(
    "path",
    (
        "/",
        ".json",
        "/searchable",
        "/searchable.json",
        "/searchable_view",
        "/searchable_view.json",
    ),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
  response = app_client_two_attached_databases.get(
        "/extra~20database" + path, follow_redirects=True
    )

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:


/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
    return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
    return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
    raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
    result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
    return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
    httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
    return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
    return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
    response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
    response = await self._send_handling_redirects(


self = <httpx.AsyncClient object at 0x7fffef085460> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable_view%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E2Ejson')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]

async def _send_handling_redirects(
    self,
    request: Request,
    follow_redirects: bool,
    history: typing.List[Response],
) -> Response:
    while True:
        if len(history) > self.max_redirects:
          raise TooManyRedirects(
                "Exceeded maximum allowed redirects.", request=request
            )

E httpx.TooManyRedirects: Exceeded maximum allowed redirects.

/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
__________ test_weird_database_names[database (1).sqlite] __________
[gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw7/test_weird_database_names_data0') filename = 'database (1).sqlite'

@pytest.mark.parametrize(
    "filename", ["test-database (1).sqlite", "database (1).sqlite"]
)
def test_weird_database_names(tmpdir, filename):
    # https://github.com/simonw/datasette/issues/1181
    runner = CliRunner()
    db_path = str(tmpdir / filename)
    sqlite3.connect(db_path).execute("vacuum")
    result1 = runner.invoke(cli, [db_path, "--get", "/"])
    assert result1.exit_code == 0, result1.output
    filename_no_stem = filename.rsplit(".", 1)[0]
    expected_link = '<a href="/{}">{}</a>'.format(
        tilde_encode(filename_no_stem), filename_no_stem
    )
    assert expected_link in result1.output
    # Now try hitting that database page
    result2 = runner.invoke(
        cli, [db_path, "--get", "/{}".format(tilde_encode(filename_no_stem))]
    )
  assert result2.exit_code == 0, result2.output

E   AssertionError:
E   assert 1 == 0
E    +  where 1 = <Result SystemExit(1)>.exit_code

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError
________ test_weird_database_names[test-database (1).sqlite] ________
[gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw6/test_weird_database_names_test0') filename = 'test-database (1).sqlite'

@pytest.mark.parametrize(
    "filename", ["test-database (1).sqlite", "database (1).sqlite"]
)
def test_weird_database_names(tmpdir, filename):
    # https://github.com/simonw/datasette/issues/1181
    runner = CliRunner()
    db_path = str(tmpdir / filename)
    sqlite3.connect(db_path).execute("vacuum")
    result1 = runner.invoke(cli, [db_path, "--get", "/"])
    assert result1.exit_code == 0, result1.output
    filename_no_stem = filename.rsplit(".", 1)[0]
    expected_link = '<a href="/{}">{}</a>'.format(
        tilde_encode(filename_no_stem), filename_no_stem
    )
    assert expected_link in result1.output
    # Now try hitting that database page
    result2 = runner.invoke(
        cli, [db_path, "--get", "/{}".format(tilde_encode(filename_no_stem))]
    )
  assert result2.exit_code == 0, result2.output

E   AssertionError:
E   assert 1 == 0
E    +  where 1 = <Result SystemExit(1)>.exit_code

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError
_ test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] _
[gw11] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffec2d37f0> path = '/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd' expected = [['<td class="col-pk1 type-str">a/b</td>', '<td class="col-pk2 type-str">.c-d</td>', '<td class="col-content type-str">c</td>']]

@pytest.mark.parametrize(
    "path,expected",
    (
        (
            "/fixtures/compound_primary_key/a,b",
            [
                [
                    '<td class="col-pk1 type-str">a</td>',
                    '<td class="col-pk2 type-str">b</td>',
                    '<td class="col-content type-str">c</td>',
                ]
            ],
        ),
        (
            "/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd",
            [
                [
                    '<td class="col-pk1 type-str">a/b</td>',
                    '<td class="col-pk2 type-str">.c-d</td>',
                    '<td class="col-content type-str">c</td>',
                ]
            ],
        ),
    ),
)
def test_row_html_compound_primary_key(app_client, path, expected):
    response = app_client.get(path)
  assert response.status == 200

E   assert 302 == 200
E    +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffd47c23a0>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:370: AssertionError
_ test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] _
[gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffd4743fa0> path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_classes = ['table', 'db-fixtures', 'table-tablewithslashescsv-fa7563']

@pytest.mark.parametrize(
    "path,expected_classes",
    [
        ("/", ["index"]),
        ("/fixtures", ["db", "db-fixtures"]),
        ("/fixtures?sql=select+1", ["query", "db-fixtures"]),
        (
            "/fixtures/simple_primary_key",
            ["table", "db-fixtures", "table-simple_primary_key"],
        ),
        (
            "/fixtures/neighborhood_search",
            ["query", "db-fixtures", "query-neighborhood_search"],
        ),
        (
            "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
            ["table", "db-fixtures", "table-tablewithslashescsv-fa7563"],
        ),
        (
            "/fixtures/simple_primary_key/1",
            ["row", "db-fixtures", "table-simple_primary_key"],
        ),
    ],
)
def test_css_classes_on_body(app_client, path, expected_classes):
    response = app_client.get(path)
  assert response.status == 200

E   assert 302 == 200
E    +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffd4628dc0>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:238: AssertionError
_ test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] _
[gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffd4743fa0> path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_considered = 'table-fixtures-tablewithslashescsv-fa7563.html, *table.html'

@pytest.mark.parametrize(
    "path,expected_considered",
    [
        ("/", "*index.html"),
        ("/fixtures", "database-fixtures.html, *database.html"),
        (
            "/fixtures/simple_primary_key",
            "table-fixtures-simple_primary_key.html, *table.html",
        ),
        (
            "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
            "table-fixtures-tablewithslashescsv-fa7563.html, *table.html",
        ),
        (
            "/fixtures/simple_primary_key/1",
            "row-fixtures-simple_primary_key.html, *row.html",
        ),
    ],
)
def test_templates_considered(app_client, path, expected_considered):
    response = app_client.get(path)
  assert response.status == 200

E   assert 302 == 200
E    +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffd44f2d60>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:264: AssertionError
_ test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] _
[gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffecd9fac0> path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected = 'http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json'

@pytest.mark.parametrize(
    "path,expected",
    (
        # Instance index page
        ("/", "http://localhost/.json"),
        # Table page
        ("/fixtures/facetable", "http://localhost/fixtures/facetable.json"),
        (
            "/fixtures/table~2Fwith~2Fslashes~2Ecsv",
            "http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json",
        ),
        # Row page
        (
            "/fixtures/no_primary_key/1",
            "http://localhost/fixtures/no_primary_key/1.json",
        ),
        # Database index page
        (
            "/fixtures",
            "http://localhost/fixtures.json",
        ),
        # Custom query page
        (
            "/fixtures?sql=select+*+from+facetable",
            "http://localhost/fixtures.json?sql=select+*+from+facetable",
        ),
        # Canned query page
        (
            "/fixtures/neighborhood_search?text=town",
            "http://localhost/fixtures/neighborhood_search.json?text=town",
        ),
        # /-/ pages
        (
            "/-/plugins",
            "http://localhost/-/plugins.json",
        ),
    ),
)
def test_alternate_url_json(app_client, path, expected):
    response = app_client.get(path)
  assert response.status == 200

E   assert 302 == 200
E    +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffd4650b80>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:948: AssertionError
_ test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] _
[gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffec5952e0> path = '/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC' expected = '/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B'

@pytest.mark.parametrize(
    "path,expected",
    [
        (
            "/fixtures/neighborhood_search",
            "/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&amp;text=",
        ),
        (
            "/fixtures/neighborhood_search?text=ber",
            "/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&amp;text=ber",
        ),
        ("/fixtures/pragma_cache_size", None),
        (
            # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬
            "/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC",
            "/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B",
        ),
        ("/fixtures/magic_parameters", None),
    ],
)
def test_edit_sql_link_on_canned_queries(app_client, path, expected):
    response = app_client.get(path)
  assert response.status == 200

E   assert 302 == 200
E    +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffec4a6f70>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:841: AssertionError
____________ test_table_with_slashes_in_name ____________
[gw9] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffec0860a0>

def test_table_with_slashes_in_name(app_client):
    response = app_client.get(
        "/fixtures/table~2Fwith~2Fslashes~2Ecsv.json?_shape=objects"
    )
  assert response.status == 200

E   assert 302 == 200
E    +  where 302 = <datasette.utils.testing.TestResponse object at 0x7fffec086fa0>.status

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:141: AssertionError
________ test_custom_query_with_unicode_characters ________
[gw8] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffec1cda90>

def test_custom_query_with_unicode_characters(app_client):
    # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬.json
    response = app_client.get(
        "/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC.json?_shape=array"
    )
  assert [{"id": 1, "name": "San Francisco"}] == response.json

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:1042:


/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:40: in json
    return json.loads(self.text)
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/__init__.py:346: in loads
    return _default_decoder.decode(s)
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:337: in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())


self = <json.decoder.JSONDecoder object at 0x7ffff7479760>, s = '', idx = 0

def raw_decode(self, s, idx=0):
    """Decode a JSON document from ``s`` (a ``str`` beginning with
    a JSON document) and return a 2-tuple of the Python
    representation and the index in ``s`` where the document ended.

    This can be used to decode a JSON document from a string that may
    have extraneous data at the end.

    """
    try:
        obj, end = self.scan_once(s, idx)
    except StopIteration as err:
      raise JSONDecodeError("Expecting value", s, err.value) from None

E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:355: JSONDecodeError
_ test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] _
[gw13] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

app_client = <datasette.utils.testing.TestClient object at 0x7fffd470bf10> path = '/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]

@pytest.mark.parametrize(
    "path,expected_rows",
    [
        (
            "/fixtures/searchable.json?_search=dog",
            [
                [1, "barry cat", "terry dog", "panther"],
                [2, "terry dog", "sara weasel", "puma"],
            ],
        ),
        (
            # Special keyword shouldn't break FTS query
            "/fixtures/searchable.json?_search=AND",
            [],
        ),
        (
            # Without _searchmode=raw this should return no results
            "/fixtures/searchable.json?_search=te*+AND+do*",
            [],
        ),
        (
            # _searchmode=raw
            "/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw",
            [
                [1, "barry cat", "terry dog", "panther"],
                [2, "terry dog", "sara weasel", "puma"],
            ],
        ),
        (
            # _searchmode=raw combined with _search_COLUMN
            "/fixtures/searchable.json?_search_text2=te*&_searchmode=raw",
            [
                [1, "barry cat", "terry dog", "panther"],
            ],
        ),
        (
            "/fixtures/searchable.json?_search=weasel",
            [[2, "terry dog", "sara weasel", "puma"]],
        ),
        (
            "/fixtures/searchable.json?_search_text2=dog",
            [[1, "barry cat", "terry dog", "panther"]],
        ),
        (
            "/fixtures/searchable.json?_search_name%20with%20.%20and%20spaces=panther",
            [[1, "barry cat", "terry dog", "panther"]],
        ),
    ],
)
def test_searchable(app_client, path, expected_rows):
    response = app_client.get(path)
  assert expected_rows == response.json["rows"]

E   AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == []
E   Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E   Full diff:
E     [
E   - ,
E   + [1,
E   +  'barry cat',
E   +  'terry dog',
E   +  'panther'],
E   + [2,
E   +  'terry dog',
E   +  'sara weasel',
E   +  'puma'],
E     ]

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:402: AssertionError
____ test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] ____
[gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

table_metadata = {'searchmode': 'raw'}, querystring = '_search=te*+AND+do*' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]

@pytest.mark.parametrize(
    "table_metadata,querystring,expected_rows",
    [
        (
            {},
            "_search=te*+AND+do*",
            [],
        ),
        (
            {"searchmode": "raw"},
            "_search=te*+AND+do*",
            _SEARCHMODE_RAW_RESULTS,
        ),
        (
            {},
            "_search=te*+AND+do*&_searchmode=raw",
            _SEARCHMODE_RAW_RESULTS,
        ),
        # Can be over-ridden with _searchmode=escaped
        (
            {"searchmode": "raw"},
            "_search=te*+AND+do*&_searchmode=escaped",
            [],
        ),
    ],
)
def test_searchmode(table_metadata, querystring, expected_rows):
    with make_app_client(
        metadata={"databases": {"fixtures": {"tables": {"searchable": table_metadata}}}}
    ) as client:
        response = client.get("/fixtures/searchable.json?" + querystring)
      assert expected_rows == response.json["rows"]

E   AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == []
E   Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E   Full diff:
E     [
E   - ,
E   + [1,
E   +  'barry cat',
E   +  'terry dog',
E   +  'panther'],
E   + [2,
E   +  'terry dog',
E   +  'sara weasel',
E   +  'puma'],
E     ]

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError
_ test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] _
[gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python

table_metadata = {}, querystring = '_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]

@pytest.mark.parametrize(
    "table_metadata,querystring,expected_rows",
    [
        (
            {},
            "_search=te*+AND+do*",
            [],
        ),
        (
            {"searchmode": "raw"},
            "_search=te*+AND+do*",
            _SEARCHMODE_RAW_RESULTS,
        ),
        (
            {},
            "_search=te*+AND+do*&_searchmode=raw",
            _SEARCHMODE_RAW_RESULTS,
        ),
        # Can be over-ridden with _searchmode=escaped
        (
            {"searchmode": "raw"},
            "_search=te*+AND+do*&_searchmode=escaped",
            [],
        ),
    ],
)
def test_searchmode(table_metadata, querystring, expected_rows):
    with make_app_client(
        metadata={"databases": {"fixtures": {"tables": {"searchable": table_metadata}}}}
    ) as client:
        response = client.get("/fixtures/searchable.json?" + querystring)
      assert expected_rows == response.json["rows"]

E   AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == []
E   Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E   Full diff:
E     [
E   - ,
E   + [1,
E   +  'barry cat',
E   +  'terry dog',
E   +  'panther'],
E   + [2,
E   +  'terry dog',
E   +  'sara weasel',
E   +  'puma'],
E     ]

/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError
=========================== short test summary info ============================
FAILED tests/test_api.py::test_row_strange_table_name - assert 400 == 200
FAILED tests/test_api.py::test_database_page_for_database_with_dot_in_name - ...
FAILED tests/test_api.py::test_tilde_encoded_database_names[fo%o] - assert 30...
FAILED tests/test_api.py::test_tilde_encoded_database_names[f~/c.d] - assert ...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable.json]
FAILED tests/test_api.py::test_database_with_space_in_name[.json] - httpx.Too...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view]
FAILED tests/test_api.py::test_database_with_space_in_name[/] - httpx.TooMany...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable] - htt...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view.json]
FAILED tests/test_cli.py::test_weird_database_names[database (1).sqlite] - As...
FAILED tests/test_cli.py::test_weird_database_names[test-database (1).sqlite]
FAILED tests/test_html.py::test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1]
FAILED tests/test_html.py::test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5]
FAILED tests/test_html.py::test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html]
FAILED tests/test_html.py::test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json]
FAILED tests/test_html.py::test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B]
FAILED tests/test_table_api.py::test_table_with_slashes_in_name - assert 302 ...
FAILED tests/test_table_api.py::test_custom_query_with_unicode_characters - j...
FAILED tests/test_table_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]
FAILED tests/test_table_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1]
FAILED tests/test_table_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2]
=========== 22 failed, 1049 passed, 3 skipped in 1522.28s (0:25:22) ============
error: in phase 'check': uncaught exception:
%exception #<&invoke-error program: "/gnu/store/ziqwkzz6znb5d3c245xn0cq5ra2ly0w3-python-pytest-7.1.3/bin/pytest" arguments: ("-vv" "-n" "24" "-m" "not serial") exit-status: 1 term-signal: #f stop-signal: #f>
phase `check' failed after 1523.3 seconds
The tests run in a private namespace without internet connectivity, and the Python dependencies are at:
python-aiofiles@0.6.0 python-asgi-csrf@0.9 python-asgiref@3.4.1
python-beautifulsoup4@4.11.1 python-black@22.3.0 python-click-default-group@1.2.2 python-click@8.1.3
python-cogapp@3.3.0 python-httpx@0.23.0 python-hupper@1.10.3 python-itsdangerous@2.0.1
python-janus@1.0.0 python-jinja2@3.1.1 python-mergedeep@1.3.4 python-pint@0.20.1 python-pluggy@1.0.0
python-pytest-asyncio@0.17.2 python-pytest-runner@5.2 python-pytest-timeout@2.0.2
python-pytest-xdist@2.5.0 python-pytest@7.1.3 python-pyyaml@6.0 python-setuptools@64.0.3
python-trustme@0.9.0 python-uvicorn@0.17.6
```

With Python 3.9.9.

Thank you!

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2048/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
323658641 MDU6SXNzdWUzMjM2NTg2NDE= 262 Add ?_extra= mechanism for requesting extra properties in JSON simonw 9599 open 0   Datasette 1.0 3268330 27 2018-05-16T14:55:42Z 2023-03-29T06:22:22Z   OWNER  

Datasette views currently work by creating a set of data that should be returned as JSON, then defining an additional, optional template_data() function which is called if the view is being rendered as HTML.

This template_data() function calculates extra template context variables which are necessary for the HTML view but should not be included in the JSON.

Example of how that is used today: https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/table.py#L672-L704

With features like Facets in #255 I'm beginning to want to move more items into the template_data() - in the case of facets it's the suggested_facets array. This saves that feature from being calculated (involving several SQL queries) for the JSON case where it is unlikely to be used.

But... as an API user, I want to still optionally be able to access that information.

Solution: Add a ?_extra=suggested_facets&_extra=table_metadata argument which can be used to optionally request additional blocks to be added to the JSON API.

Then redefine as many of the current template_data() features as extra arguments instead, and teach Datasette to return certain extras by default when rendering templates.

This could allow the JSON representation to be slimmed down further (removing e.g. the table_definition and view_definition keys) while still making that information available to API users who need it.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/262/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1641013220 I_kwDOBm6k_c5hz9_k 2045 First column on a view page has no facet option in cog menu simonw 9599 open 0   Datasette 1.0 3268330 0 2023-03-26T18:02:47Z 2023-03-26T18:02:48Z   OWNER  

e.g. first column on this page - cog menu has no option to facet.

https://datasette.io/content/tools

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2045/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1121583414 I_kwDOBm6k_c5C2gE2 1619 JSON link on row page is 404 if base_url setting is used simonw 9599 open 0     5 2022-02-02T07:09:53Z 2023-03-24T15:38:04Z   OWNER  

On my local environment:

datasette fixtures.db -p 3344 --setting base_url /foo/bar/

Then hit http://127.0.0.1:3344/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3

But... that json link goes here, which is a 404:

http://127.0.0.1:3344/foo/bar/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3?_format=json

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1619/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1636616315 I_kwDOBm6k_c5hjMh7 2042 Gather feedback on new ?_extra= design simonw 9599 open 0     0 2023-03-22T23:07:43Z 2023-03-22T23:08:19Z   OWNER  

Now that I've landed:
- #1999

See also:
- #262

I want to get some feedback from people on the design of the new ?_extra= feature, before freezing it into Datasette 1.0.

The big change is that the default JSON representation is now MUCH slimmer - it only gives you keys for "next" and "rows", where rows is a list of JSON objects (not a list of arrays as was previously the default) - for example https://latest.datasette.io/fixtures/sortable.json

If you want extra stuff you can ask for it with the new ?_extra= parameter - e.g. https://latest.datasette.io/fixtures/sortable.json?_extra=columns&_extra=suggested_facets

You can use ?_extra=extras to see a list of available extras: https://latest.datasette.io/fixtures/sortable.json?_extra=extras

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2042/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1620515757 I_kwDOBm6k_c5glxut 2039 Subtle bug with `--load-extension` and `--static` flags with absolute Windows paths with`C:\` asg017 15178711 open 0     0 2023-03-12T21:18:52Z 2023-03-12T21:18:52Z   CONTRIBUTOR  

From the Datasette discord: a user tried running the following command on Windows:

datasette --load-extension="C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll"

This failed with "The specified module could not be found", because the entrypoint option introduced in #1789 splits the input differently. Instead of loading the extension found at "C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll", it instead tried to load the extension at "C" with entrypoint "\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll".

This is hard because most absolute Windows paths have a colon in them, like C:\foo.txt or D:\bar.txt. I'd imagine the --static flag is also vulnerable to this type of bug.
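A minimal sketch (a hypothetical helper, not Datasette's actual parsing code) of why splitting on the first colon mangles an absolute Windows path:

```python
def parse_load_extension(value):
    # "path:entrypoint" -> (path, entrypoint); no colon -> (path, None)
    path, _, entrypoint = value.partition(":")
    return path, entrypoint or None

print(parse_load_extension(r"C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll"))
# ('C', '\\spatialite\\mod_spatialite-5.0.1-win-x86\\mod_spatialite.dll')
# The drive letter is mistaken for the extension path. One possible fix:
# ignore a colon whose prefix is a single drive letter.
```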

The "solution" is to use a relative path instead, but that doesn't feel that great.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2039/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
317001500 MDU6SXNzdWUzMTcwMDE1MDA= 236 datasette publish lambda plugin simonw 9599 open 0     11 2018-04-23T22:10:30Z 2023-03-12T14:04:15Z   OWNER  

Refs #217 - create a publish plugin that can deploy to AWS Lambda.

https://docs.aws.amazon.com/lambda/latest/dg/limits.html says lambda packages can be up to 50 MB, so this would only work with smaller databases (the command can check the filesize before attempting to package and deploy it).

Lambdas do get a 512 MB /tmp directory too, so for larger databases the function could start and then download up to 512MB from an S3 bucket - so the plugin could take an optional S3 bucket to write to and know how to upload the .db file there and then have the lambda download it on startup.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/236/reactions",
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1618249044 I_kwDOBm6k_c5gdIVU 2038 Consider a `strict_templates` setting simonw 9599 open 0     2 2023-03-10T02:09:13Z 2023-03-10T02:11:06Z   OWNER  

A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error.

Prototype here: diff diff --git a/datasette/app.py b/datasette/app.py index 40416713..1428a3f0 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -200,6 +200,7 @@ SETTINGS = ( "Allow display of SQL trace debug information with ?_trace=1", ), Setting("base_url", "/", "Datasette URLs should use this base path"), + Setting("strict_templates", False, "Raise errors for undefined template variables"), ) _HASH_URLS_REMOVED = "The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead" OBSOLETE_SETTINGS = { @@ -399,11 +400,14 @@ class Datasette: ), ] ) + env_extras = {} + if self.setting("strict_templates"): + env_extras["undefined"] = StrictUndefined self.jinja_env = Environment( loader=template_loader, autoescape=True, enable_async=True, - undefined=StrictUndefined, + **env_extras, ) self.jinja_env.filters["escape_css_string"] = escape_css_string self.jinja_env.filters["quote_plus"] = urllib.parse.quote_plus Explored this idea a bit in: - #1999

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2038/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1590183272 I_kwDOBm6k_c5eyEVo 2027 How to redirect from "/" to a specific db/table dmick 1350673 open 0     4 2023-02-18T03:14:01Z 2023-03-08T04:42:22Z   NONE  

Using nginx to redirect public IP to the local uvicorn server as 'normal'. I can't figure out how to redirect such that '/' results in accessing the one db/table I want to serve; redirecting / to /db/table breaks some of the CSS; fooling with base_url doesn't seem to help. Can someone explain this, if it's possible?
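One approach, sketched as a hypothetical nginx config (the /db/table target and port 8001 are placeholders, not tested against this setup): redirect only the exact root URL and proxy everything else through unchanged, so Datasette's CSS and asset paths keep working:

```nginx
# Redirect only "/" itself; all other paths pass straight through
location = / {
    return 302 /db/table;
}
location / {
    proxy_pass http://127.0.0.1:8001;
    proxy_set_header Host $host;
}
```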

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2027/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1605959201 I_kwDOBm6k_c5fuP4h 2032 datasette errors when foreign key integrity is enabled cldellow 193185 open 0     0 2023-03-02T01:27:51Z 2023-03-02T01:31:58Z   CONTRIBUTOR  

By default, SQLite does not enforce foreign key constraints. I typically enable these checks by running:

```sql
PRAGMA foreign_keys = ON;
```

inside of a prepare_connection hook.

If a plugin causes the schema to change (eg datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with:

FOREIGN KEY constraint failed

This could be resolved by either:
- deleting from the tables column last
- changing the schema so that the foreign keys have ON DELETE CASCADE

Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can workaround this by inspecting the database parameter in prepare_connection and trying not to enable fkey checks on the _internal database.
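A minimal sketch of that workaround, written as a plugin using the prepare_connection hook (the "_internal" name check mirrors the idea above):

```python
from datasette import hookimpl


@hookimpl
def prepare_connection(conn, database, datasette):
    # Leave Datasette's internal schema database alone, since its
    # delete-then-reinsert bookkeeping would trip the FK checks
    if database == "_internal":
        return
    conn.execute("PRAGMA foreign_keys = ON;")
```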

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2032/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
828858421 MDU6SXNzdWU4Mjg4NTg0MjE= 1258 Allow canned query params to specify default values wdccdw 1385831 open 0     5 2021-03-11T07:19:02Z 2023-02-20T23:39:58Z   NONE  

If I call a canned query that includes named parameters, without passing any parameters, datasette runs the query anyway, resulting in an HTTP status code 400, and a visible error in the browser, with only a link back to home. This means that one of the default links on https://site/database/ will lead to a broken page with no apparent way out.

Is there any way to skip performing the query when parameters aren't supplied, but otherwise render the usual canned query page? Alternatively, can I supply default values for my parameters, either when defining my canned queries or when linking to the canned query page from the default database template?
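For reference, a canned query with a named parameter is defined in metadata.json roughly like this (hypothetical database and query names); visiting /mydb/items_since without supplying ?since=... is what currently produces the 400:

```json
{
    "databases": {
        "mydb": {
            "queries": {
                "items_since": {
                    "sql": "select * from items where created >= :since"
                }
            }
        }
    }
}
```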

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1258/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1592327343 I_kwDOBm6k_c5e6Pyv 2029 Sorry Simon, didn't know how else to contact you llchristopherson 5804626 open 0     0 2023-02-20T19:02:53Z 2023-02-20T19:02:53Z   NONE  

Hi Simon,

Would you be willing to chat with me about Datasette? I have some questions. I am working on a project to evaluate data ingestion tools for a research organization and I ran across Datasette. I have looked through a lot of your documentation, but still have some questions, which are very specific. If you would be willing to write me back about this, my email is laura@renci.org.

Thanks, Laura

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2029/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1323346408 I_kwDOBm6k_c5O4Kno 1775 i18n support johnfelipe 428820 open 0     9 2022-07-31T02:51:04Z 2023-02-10T18:04:40Z   NONE  

I want to contribute translations of the UI to es, de and it, if you share the strings.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1775/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1579973223 I_kwDOBm6k_c5eLHpn 2024 Mention WAL mode in documentation simonw 9599 open 0     1 2023-02-10T16:11:10Z 2023-02-10T16:11:53Z   OWNER  

It's not currently obvious from the docs how you can ensure that Datasette runs well in situations where other processes may update the underlying SQLite files.
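For anyone landing here before the docs are updated, the usual recipe is a one-time pragma; WAL mode is a persistent property of the database file itself:

```sql
PRAGMA journal_mode=WAL;
```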

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2024/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1577548579 I_kwDOBm6k_c5eB3sj 2021 Docker images for 1.0 alphas? meowcat 1563881 open 0     0 2023-02-09T09:35:52Z 2023-02-09T09:35:52Z   NONE  

Hi, would you consider putting 1.0alpha images on Dockerhub?

(Also, how usable are the alphas?)

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2021/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1575880841 I_kwDOBm6k_c5d7giJ 2020 Documentation refers to "off" setting; doesn't seem to work, "false" does dmick 1350673 open 0     0 2023-02-08T10:38:10Z 2023-02-08T10:38:10Z   NONE  

https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using "off" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a "JSON boolean" and have the values "true" or "false". Perhaps the Python code is more flexible?...but either way, the documentation probably should mention it.
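For example, in a settings.json loaded from a configuration directory, the working form appears to be a JSON boolean rather than "off":

```json
{
    "suggest_facets": false
}
```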

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2020/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1573424830 I_kwDOBm6k_c5dyI6- 2019 Refactor out the keyset pagination code simonw 9599 open 0     14 2023-02-06T23:04:00Z 2023-02-08T01:40:46Z   OWNER  

While working on:
- #1999

I noticed that some of the most complex code in the existing table view is the code that implements keyset pagination:

https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/views/table.py#L417-L493

Extracting that into a utility function would simplify that code a lot.
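For context, the pattern that code implements looks roughly like this (a sketch over a hypothetical table with a unique id column, not Datasette's generated SQL):

```sql
-- Resume after the last row of the previous page; fetch one extra row
-- so the presence of a next page can be detected
select * from mytable
where id > :last_seen_id
order by id
limit 101;
```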

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2019/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1571711808 I_kwDOBm6k_c5drmtA 2018 `check_visibility` gives confusing (wrong?) results if permission is `None` cldellow 193185 open 0     0 2023-02-06T01:03:08Z 2023-02-06T01:03:46Z   CONTRIBUTOR  

I'm trying to gate access to an edit UI on the user having update-row on the underlying view or table.

I expected datasette.check_visibility to be a good way to do this:

```python
visible, private = await datasette.check_visibility(
    request.actor,
    permissions=[
        ("update-row", (database, table)),
    ],
)

if not visible:
    return None

```

But visible is returning true, even when there is no explicit update-row permission. (In this case, request.actor is None.)

Based on the update-row permissions docs, I expected this to be default deny, and so no explicit permission would result in false.

I think the root cause is that check_visibility calls ensure_permissions and expects it to throw if the permission is not available.

But ensure_permissions does not throw when permission_allowed returns None: https://github.com/simonw/datasette/blob/1.0a2/datasette/app.py#L825-L829

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2018/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1571207083 I_kwDOBm6k_c5dprer 2016 Database metadata fields like description are not available in the index page template's context palewire 9993 open 0   Datasette 1.0 3268330 1 2023-02-05T02:25:53Z 2023-02-05T22:56:43Z   NONE  

When looping through databases in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. It would be great to have that.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2016/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1565179870 I_kwDOBm6k_c5dSr_e 2013 Datasette uses non-standard quoting for identifiers cldellow 193185 open 0     0 2023-02-01T00:05:39Z 2023-02-01T00:06:30Z   CONTRIBUTOR  

Related to #2001, but where #2001 was about literals, this is about identifiers

From https://www.sqlite.org/lang_keywords.html:

"keyword" A keyword in double-quotes is an identifier. [keyword] A keyword enclosed in square brackets is an identifier. This is not standard SQL. This quoting mechanism is used by MS Access and SQL Server and is included in SQLite for compatibility.

Datasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349, in some of the other DB access code, and in some of the test fixtures.

Migrating to standard double-quoted identifiers would make it easier to get Datasette working with alternative backends.
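A quick illustration of the two styles (a sketch, not code from the Datasette codebase):

```sql
-- SQLite/SQL Server-style bracket quoting, as Datasette emits today:
select [name] from [my table];

-- Standard SQL double-quoted identifiers:
select "name" from "my table";
```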

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2013/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1564774831 I_kwDOBm6k_c5dRJGv 2012 Missing space in database summary simonw 9599 open 0     0 2023-01-31T18:01:13Z 2023-01-31T18:01:13Z   OWNER  

Spotted this on an instance index page:

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2012/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1564769997 I_kwDOBm6k_c5dRH7N 2011 Applied facet did not result in an "x" icon to dismiss it simonw 9599 open 0     1 2023-01-31T17:57:44Z 2023-01-31T17:58:54Z   OWNER  

That's against this data https://data.sfgov.org/City-Management-and-Ethics/Supplier-Contracts/cqi5-hm2d imported using https://datasette.io/plugins/datasette-socrata

It's for Contract Type of Non-Purchasing Contract (Rents, etc.) - so it's possible that some of the spaces or punctuation in either the name or the value tripped up the code that decides if the X icon should be displayed.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2011/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1563264257 I_kwDOBm6k_c5dLYUB 2010 Row page should default to card view simonw 9599 open 0   Datasette 1.0 3268330 1 2023-01-30T21:49:37Z 2023-01-30T21:52:06Z   OWNER  

Datasette currently uses the same table layout on the row pages as it does on the table pages:

https://datasette.io/content/pypi_packages?_sort=name&name__exact=datasette-column-inspect

https://datasette.io/content/pypi_packages/datasette-column-inspect

If you shrink down to mobile width you get this instead, on both of those pages:

I think that view, which I think of as the "card view", is plain better if you're looking at just a single row - and it (or a variant of it) should be the default presentation on the row page.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2010/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1186696202 I_kwDOBm6k_c5Gu4wK 1696 Show foreign key label when filtering simonw 9599 open 0     2 2022-03-30T16:18:54Z 2023-01-29T20:56:20Z   OWNER  

For example here:

3 corresponds to "Human Related: Other" - it would be neat to display this in this area of the page somehow.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1696/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1515815014 I_kwDOBm6k_c5aWYBm 1973 render_cell plugin hook's row object is not a sqlite.Row cldellow 193185 open 0     4 2023-01-01T20:27:46Z 2023-01-29T00:40:31Z   CONTRIBUTOR  

From https://docs.datasette.io/en/stable/plugin_hooks.html#render-cell-row-value-column-table-database-datasette:

row - sqlite.Row
    The SQLite row object that the value being rendered is part of

This appears to actually be a CustomRow, but I think that's unrelated to my issue.

I have a table:

```sql
CREATE TABLE IF NOT EXISTS "dss_job_stats"(
  job_id integer not null references dss_job(id) on delete cascade,
  host text not null,
  -- other columns elided as irrelevant
  primary key (job_id, host)
);
```

On datasette 0.63.2, the render_cell hook receives a row value that looks like:

CustomRow([('job_id', {'value': 2, 'label': '2'}), ('host', 'cldellow.com')])

I expected the job_id value to be 2, but it's actually {'value': 2, 'label': '2'}.

I can work around this, but was wondering if this was intended behaviour?
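A minimal sketch of that workaround (hypothetical plugin code, using the hook signature quoted above):

```python
from datasette import hookimpl


@hookimpl
def render_cell(row, value, column, table, database, datasette):
    if table != "dss_job_stats":
        return None
    job_id = row["job_id"]
    # Foreign key columns arrive wrapped as {'value': ..., 'label': ...}
    if isinstance(job_id, dict):
        job_id = job_id["value"]
    # Returning None falls through to Datasette's default rendering
    return None
```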

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1973/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1558644003 I_kwDOBm6k_c5c5wUj 2006 Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release simonw 9599 open 0   Datasette 1.0 3268330 2 2023-01-26T19:17:40Z 2023-01-26T19:20:53Z   OWNER  

I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes.

I can hopefully help avoid that by shipping one last entry in the 0.x series that ensures datasette publish pins to <1.0 when it installs Datasette itself.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2006/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1557507274 I_kwDOBm6k_c5c1azK 2005 `extra_template_vars` should be OK to return `None` simonw 9599 open 0     1 2023-01-26T01:40:45Z 2023-01-26T01:41:50Z   OWNER  

Got this exception and had to make sure it always returned {}:

File ".../python3.11/site-packages/datasette/app.py", line 1049, in render_template assert isinstance(extra_vars, dict), "extra_vars is of type {}".format( AssertionError: extra_vars is of type <class 'NoneType'>

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2005/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1553615704 I_kwDOBm6k_c5cmktY 2001 Datasette is not compatible with SQLite's strict quoting compilation option gwk 406380 open 0     4 2023-01-23T19:10:07Z 2023-01-25T04:59:58Z   NONE  

I have linked Python3.11 on macOS against recent SQLite that was compiled using -DSQLITE_DQS=0. This option disables interpretation of double-quoted identifiers as string literals, described in the SQLite docs as a "MySQL 3.x misfeature". See https://www.sqlite.org/quirks.html#dblquote for background.

Datasette uses the double-quote syntax in a number of key places, and is thus completely broken in this environment.

My experience was to pip install datasette, then run datasette serve -I my-data.db. When I visit http://127.0.0.1:8001 I get a 500 response.

The error: sqlite3.OperationalError: no such column: geometry_columns

The responsible SQL: 'select 1 from sqlite_master where tbl_name = "geometry_columns"'

I then installed datasette from GitHub master in development mode and changed the offending SQL to use correct quotes: "select 1 from sqlite_master where tbl_name = 'geometry_columns'".

With this change, I get a little further, but have the same problem with the first table name in my database (in my case, "Meta"):

OperationalError: no such column: Meta
Traceback (most recent call last):
  File "/Users/gwk/external/datasette/datasette/app.py", line 1522, in route_path
    response = await view(request, send)
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/views/base.py", line 151, in view
    return await self.dispatch_request(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/views/base.py", line 105, in dispatch_request
    response = await handler(request)
               ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/views/index.py", line 70, in get
    "fts_table": await db.fts_table(table),
                 ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/database.py", line 363, in fts_table
    return await self.execute_fn(lambda conn: detect_fts(conn, table))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/database.py", line 213, in execute_fn
    return await asyncio.get_event_loop().run_in_executor(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/py/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/database.py", line 211, in in_thread
    return fn(conn)
           ^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/database.py", line 363, in <lambda>
    return await self.execute_fn(lambda conn: detect_fts(conn, table))
                                 ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gwk/external/datasette/datasette/utils/__init__.py", line 588, in detect_fts
    rows = conn.execute(detect_fts_sql(table)).fetchall()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sqlite3.OperationalError: no such column: Meta
INFO:     127.0.0.1:50258 - "GET / HTTP/1.1" 500 Internal Server Error

I will try to continue playing with this, but I also hope that the datasette developers will enable this mode in a test environment as I am unlikely to be able to exercise all of the SQL in the codebase, or make a pull request very soon.

Note that the DQS setting compile-time option can be overridden at runtime with calls to the C API:

```c
sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DDL, 0, (void*)0);
sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DML, 0, (void*)0);
```

As far as I can tell, sqlite3_db_config is not exposed in Python, but perhaps we could figure out how to invoke it using ctypes.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2001/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
743371103 MDU6SXNzdWU3NDMzNzExMDM= 1099 Support linking to compound foreign keys simonw 9599 open 0     6 2020-11-15T23:23:17Z 2023-01-25T00:58:26Z   OWNER  

Reported as a bug in #1098 because they caused 500 errors - but it would be even better if Datasette could hyperlink to related rows via compound foreign keys.

datasette 107914493 issue    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1099/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT, [reactions] TEXT, [draft] INTEGER, [state_reason] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);