id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason
1983600865,PR_kwDOBm6k_c5e7WH7,2206,Bump the python-packages group with 1 update,49699333,open,0,,,1,2023-11-08T13:18:56Z,2023-12-08T13:46:24Z,,CONTRIBUTOR,simonw/datasette/pulls/2206,"Bumps the python-packages group with 1 update: [black](https://github.com/psf/black).
Release notes
Sourced from black's releases.
23.11.0
Highlights
- Support formatting ranges of lines with the new `--line-ranges` command-line option (#4020)
Stable style
- Fix crash on formatting bytes strings that look like docstrings (#4003)
- Fix crash when whitespace followed a backslash before newline in a docstring (#4008)
- Fix standalone comments inside complex blocks crashing Black (#4016)
- Fix crash on formatting code like `await (a ** b)` (#3994)
- No longer treat leading f-strings as docstrings. This matches Python's behaviour and
fixes a crash (#4019)
Preview style
- Multiline dicts and lists that are the sole argument to a function are now
indented less (#3964)
- Multiline unpacked dicts and lists as the sole argument to a function are now also
indented less (#3992)
- In f-string debug expressions, quote types that are visible in the final string
are now preserved (#4005)
- Fix a bug where long `case` blocks were not split into multiple lines. Also enable general trailing comma rules on `case` blocks (#4024)
- Keep requiring two empty lines between module-level docstring and first function or
class definition (#4028)
- Add support for single-line format skip with other comments on the same line (#3959)
Configuration
- Consistently apply force exclusion logic before resolving symlinks (#4015)
- Fix a bug in the matching of absolute path names in `--include` (#3976)
Performance
- Fix mypyc builds on arm64 on macOS (#4017)
Integrations
- Black's pre-commit integration will now run only on git hooks appropriate for a code
formatter (#3940)
23.10.1
Highlights
- Maintenance release to get a fix out for GitHub Action edge case (#3957)
Preview style
... (truncated)
Changelog
Sourced from black's changelog.
23.11.0
Highlights
- Support formatting ranges of lines with the new `--line-ranges` command-line option (#4020)
Stable style
- Fix crash on formatting bytes strings that look like docstrings (#4003)
- Fix crash when whitespace followed a backslash before newline in a docstring (#4008)
- Fix standalone comments inside complex blocks crashing Black (#4016)
- Fix crash on formatting code like `await (a ** b)` (#3994)
- No longer treat leading f-strings as docstrings. This matches Python's behaviour and
fixes a crash (#4019)
Preview style
- Multiline dicts and lists that are the sole argument to a function are now indented
less (#3964)
- Multiline unpacked dicts and lists as the sole argument to a function are now also
indented less (#3992)
- In f-string debug expressions, quote types that are visible in the final string are
now preserved (#4005)
- Fix a bug where long `case` blocks were not split into multiple lines. Also enable general trailing comma rules on `case` blocks (#4024)
- Keep requiring two empty lines between module-level docstring and first function or
class definition (#4028)
- Add support for single-line format skip with other comments on the same line (#3959)
Configuration
- Consistently apply force exclusion logic before resolving symlinks (#4015)
- Fix a bug in the matching of absolute path names in `--include` (#3976)
Performance
- Fix mypyc builds on arm64 on macOS (#4017)
Integrations
- Black's pre-commit integration will now run only on git hooks appropriate for a code
formatter (#3940)
23.10.1
Highlights
- Maintenance release to get a fix out for GitHub Action edge case (#3957)
... (truncated)
Commits
- `2a1c67e` Prepare release 23.11.0 (#4032)
- `72e7a2e` Remove redundant condition from has_magic_trailing_comma (#4023)
- `1a7d9c2` Preserve visible quote types for f-string debug expressions (#4005)
- `f4c7be5` docs: fix minor typo (#4030)
- `2e4fac9` Apply force exclude logic before symlink resolution (#4015)
- `66008fd` [563] Fix standalone comments inside complex blocks crashing Black (#4016)
- `50ed622` Fix long case blocks not split into multiple lines (#4024)
- `46be1f8` Support formatting specified lines (#4020)
- `ecbd9e8` Fix crash with f-string docstrings (#4019)
- `e808e61` Preview: Keep requiring two empty lines between module-level docstring and fi...
- Additional commits viewable in compare view
[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=black&package-manager=pip&previous-version=23.9.1&new-version=23.11.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)
---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will remove the ignore condition of the specified dependency and ignore conditions
----
:books: Documentation preview :books:: https://datasette--2206.org.readthedocs.build/en/2206/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
2029908157,I_kwDOBm6k_c54_fC9,2214,CSV export fails for some `text` foreign key references,2874,open,0,,,1,2023-12-07T05:04:34Z,2023-12-07T07:36:34Z,,NONE,,"I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research.
I'm using Datasette with the [SWITRS](https://iswitrs.chp.ca.gov/) data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com.
Their data makes extensive use of codes for incident column data (`1` for `Monday` and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or `-`. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read.
If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail.
The foreign key configuration is as follows:
```
# Some tables use integer ids, like sensible tables do. Let's import them first
# since we favor them.
for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY
do
sqlite-utils create-table records.db $TABLE id integer name text --pk=id
sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id
sqlite-utils create-index records.db collisions $TABLE
done
# *Other* tables use letter keys, like they were raised by WOLVES. Let's put them
# at the end of the import queue.
for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \
PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \
PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \
STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP
do
sqlite-utils create-table records.db $TABLE key text name text --pk=key
sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv
sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key
sqlite-utils create-index records.db collisions $TABLE
done
```
You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db
If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the ""advanced"" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output:
```
INFO: 127.0.0.1:57885 - ""GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1"" 200 OK
Caught this error:
```
(No other output follows `error:` other than a blank line.)
I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
2028698018,I_kwDOBm6k_c5463mi,2213,feature request: gzip compression of database downloads,536941,open,0,,,1,2023-12-06T14:35:03Z,2023-12-06T15:05:46Z,,CONTRIBUTOR,,"At the bottom of database pages, datasette gives users the opportunity to download the underlying sqlite database. It would be great if that could be served gzip compressed.
This is similar to #1213, but in my case I don't need Datasette to compress HTML and JSON because my CDN layer does that for me. However, Cloudflare at least will not compress a mimetype of ""application""
(see list of mimetype: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
2023057255,I_kwDOBm6k_c54lWdn,2212,Can't filter with numbers,605070,open,0,,,0,2023-12-04T05:26:29Z,2023-12-04T05:26:29Z,,NONE,,"I have a schema that uses numbers for a column (actually it's a boolean 1 or 0 but SQLite doesn't have Boolean).
I can't seem to get the facet to work or even filtering on this column.
My guess is that Datasette is ""stringifying"" the number and it's not matching?
Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
2019811176,I_kwDOBm6k_c54Y99o,2211,Unreachable exception handlers for `sqlite3.OperationalError`,1214074,open,0,,,0,2023-12-01T00:50:22Z,2023-12-01T00:50:22Z,,NONE,,"There are several places where `sqlite3.OperationalError` is caught as part of an exception handler which catches multiple exceptions, but is then caught again immediately afterwards by a dedicated exception handler.
Because the exception will be caught by the first handler, the logic in the second handler is unreachable and will never be executed. If this is intended behavior, the second handler can be removed. If this is not intended, and the second handler should be the one that catches this exception, then `sqlite3.OperationalError` should be removed from the tuple of exceptions in the first handler.
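To illustrate the pattern (with made-up function names, not the actual Datasette code):
```python
import sqlite3

try:
    execute_query()  # hypothetical stand-in for the real work
except (sqlite3.OperationalError, ValueError) as e:
    handle_generic(e)  # this clause always wins for OperationalError
except sqlite3.OperationalError as e:
    handle_operational(e)  # unreachable - the exception never reaches this handler
```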
This issue was found via a CodeQL query on the repository, and I've listed the occurrences found by the query below. There may be other instances of this issue in the code that were not surfaced by the query. I'd be happy to share the query if others would like to view or run it.
One example:
https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/views/database.py#L534-L537
Other instances:
https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L266-L270
https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L452-L456",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
564833696,MDU6SXNzdWU1NjQ4MzM2OTY=,670,Prototype for Datasette on PostgreSQL,9599,open,0,,,15,2020-02-13T17:17:55Z,2023-11-17T15:32:21Z,,OWNER,,"I thought this would never happen, but now that I'm deep in the weeds of running SQLite in production for Datasette Cloud I'm starting to reconsider my policy of only supporting SQLite.
Some of the factors making me think PostgreSQL support could be worth the effort:
- Serverless. I'm getting increasingly excited about writable-database use-cases for Datasette. If it could talk to PostgreSQL then users could easily deploy it on Heroku or other serverless providers that can talk to a managed RDS-style PostgreSQL.
- Existing databases. Plenty of organizations have PostgreSQL databases. They can export to SQLite using [db-to-sqlite](https://github.com/simonw/db-to-sqlite) but that's a pretty big barrier to getting started - being able to run `datasette postgresql://connection-string` and start trying it out would be a massively better experience.
- Data size. I keep running into use-cases where I want to run Datasette against many GBs of data. SQLite can do this but PostgreSQL is much more optimized for large data, especially given the existence of tools like Citus.
- Marketing. Convincing people to trust their data to SQLite is potentially a big barrier to adoption. Even if I've convinced myself it's trustworthy I still have to convince everyone else.
- It might not be that hard? If this required a ground-up rewrite it wouldn't be worth the effort, but I have a hunch that it may not be too hard - most of the SQL in Datasette should work on both databases since it's almost all portable SELECT statements. If Datasette did DML this would be a lot harder, but it doesn't.
- Plugins! This feels like a natural surface for a plugin - at which point people could add MySQL support and suchlike in the future.
The above reasons feel strong enough to justify a prototype.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/670/reactions"", ""total_count"": 19, ""+1"": 14, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 5, ""rocket"": 0, ""eyes"": 0}",,
1994861266,PR_kwDOBm6k_c5fhgOS,2209,Fix query for suggested facets with column named value,198537,open,0,,,3,2023-11-15T14:13:30Z,2023-11-15T15:31:12Z,,CONTRIBUTOR,simonw/datasette/pulls/2209,"See discussion in https://github.com/simonw/datasette/issues/2208
----
:books: Documentation preview :books:: https://datasette--2209.org.readthedocs.build/en/2209/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2209/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1994857251,I_kwDOBm6k_c525xsj,2208,No suggested facets when a column named 'value' is included,198537,open,0,,,1,2023-11-15T14:11:17Z,2023-11-15T14:18:59Z,,CONTRIBUTOR,,"When a column named 'value' is included, no suggested facets are shown, because the query uses an alias of 'value'.
https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L168-L174
Currently the following is shown (from https://latest.datasette.io/fixtures/facetable)
![image](https://github.com/simonw/datasette/assets/198537/a919509a-ea88-461b-b25b-8b776720c7c5)
When I add a column named 'value' only the JSON facets are processed.
![image](https://github.com/simonw/datasette/assets/198537/092bd0b3-4c20-434e-88f8-47e2b8994a1d)
I think that not using aliases could be a solution (except if someone wants to use a column named `count(*)` though this seems to be unlikely). I'll open a PR with that.
There is also a TODO with a similar question in the same file. I have not looked into that yet.
https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/facets.py#L512",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1994845152,I_kwDOBm6k_c525uvg,2207,ModuleNotFoundError: No module named 'click_default_group',283441,open,0,,,0,2023-11-15T14:04:32Z,2023-11-15T14:04:32Z,,NONE,,"No matter what I do, I'm getting this error:
```
$ datasette
Traceback (most recent call last):
File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/bin/datasette"", line 5, in
from datasette.cli import cli
File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/lib/python3.11/site-packages/datasette/cli.py"", line 6, in
from click_default_group import DefaultGroup
ModuleNotFoundError: No module named 'click_default_group'
```
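A quick sanity check that can be run in the same virtualenv (standard library only):
```python
import importlib.util

# prints None if click_default_group is not importable in this environment
print(importlib.util.find_spec('click_default_group'))
```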
I have datasette in my dependencies like this:
```toml
[tool.poetry.group.dev.dependencies]
datasette = {version = ""1.0a7"", allow-prereleases = true}
```
I had the latest regular version (not pre-release) there originally, but the result was the same:
```toml
[tool.poetry.group.dev.dependencies]
datasette = ""0.64.5""
```
Full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ Previously datasette worked for me, but I guess something had to upgrade and now I can't even launch it.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1978023780,I_kwDOBm6k_c515j9k,2205,request.post_vars() method obliterates form keys with multiple values,9599,open,0,,8755003,3,2023-11-05T23:25:08Z,2023-11-06T04:10:34Z,,OWNER,,"https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L137-L139
In GET requests you can do `?foo=1&foo=2` - you can do the same in POST requests, but the `dict()` call here eliminates those duplicates.
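A rough illustration of the collapsing behaviour, using the standard library rather than the exact code in `asgi.py`:
```python
from urllib.parse import parse_qsl

pairs = parse_qsl('foo=1&foo=2', keep_blank_values=True)
print(pairs)        # [('foo', '1'), ('foo', '2')] - both values survive parsing
print(dict(pairs))  # {'foo': '2'} - wrapping in dict() keeps only the last value
```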
You can't even try calling `post_body()` and implement your own custom parsing because of:
- #2204",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1978022687,I_kwDOBm6k_c515jsf,2204,request.post_body() can only be called once,9599,open,0,,,0,2023-11-05T23:22:03Z,2023-11-05T23:23:23Z,,OWNER,,"This code here:
https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L127-L135
It consumes the messages, which means if you try to call it a second time you won't be able to get at the body.
This is efficient - we don't end up with a `request` object property with potentially megabytes of content that we never look at again - but it's inconvenient for cases like middleware or functions where we don't know if the body has been consumed yet or not.
Potential solution: set `request._body` the first time it is called, and return that on subsequent calls.
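A minimal sketch of that caching idea, assuming the standard ASGI receive protocol (not the actual implementation):
```python
async def post_body(self):
    # cache the consumed body on the request so repeated calls still work
    if not hasattr(self, '_body'):
        body = b''
        more_body = True
        while more_body:
            message = await self.receive()
            body += message.get('body', b'')
            more_body = message.get('more_body', False)
        self._body = body
    return self._body
```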
Potential optimization: only do this for bodies that are shorter than a certain threshold - maybe 1MB - and raise an exception if you attempt to call `post_body()` multiple times against one of those larger bodies.
I'm a bit nervous about that option though, since it could result in errors that don't show up in testing but do show up in production.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
959137143,MDU6SXNzdWU5NTkxMzcxNDM=,1415,feature request: document minimum permissions for service account for cloudrun,536941,open,0,,,4,2021-08-03T13:48:43Z,2023-11-05T16:46:59Z,,CONTRIBUTOR,,"Thanks again for such a powerful project.
For deploying to cloudrun from github actions, I'd like to create a service account with minimal permissions.
It would be great to document the minimum permissions that need to be set in IAM.
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1415/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1977726056,I_kwDOBm6k_c514bRo,2203,custom plugin not seen as sql function,7113541,open,0,,,0,2023-11-05T10:30:19Z,2023-11-05T10:30:19Z,,NONE,,"Hi, I'm not sure if this is the right repo for this issue.
I'm using Datasette with the parquet plugin (to read a DuckDB database) and the jellyfish plugin. Both work perfectly.
Now I need to create a simple plugin that uses the python rouge package and returns a similarity score (similarly to how the jellyfish plugin works).
If I create a custom plugin, even the example hello_world one, copied directly from the tutorial, I get the following error:
```duckdb.duckdb.CatalogException: Catalog Error: Scalar Function with name hello_world does not exist!```
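For reference, my plugin follows the documented `prepare_connection` pattern, roughly like this (simplified sketch):
```python
from datasette import hookimpl

@hookimpl
def prepare_connection(conn):
    # registers hello_world() on the sqlite3 connection that Datasette opens
    conn.create_function('hello_world', 0, lambda: 'hello world')
```
If the query is actually executed by DuckDB (via the parquet plugin) rather than by that SQLite connection, a function registered this way wouldn't exist there, which may be what the catalog error is pointing at.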
Since the jellyfish plugin doesn't do anything more complex, I'm wondering if there is some other kind of issue with my setup.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2203/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1955676270,I_kwDOBm6k_c50kUBu,2201,Discord invite link is invalid,11708906,open,0,,,0,2023-10-21T21:50:05Z,2023-10-21T21:50:05Z,,NONE,,"https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following:
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1163369515,I_kwDOBm6k_c5FV5wr,1655,query result page is using 400mb of browser memory 40x size of html page and 400x size of csv data,536941,open,0,,,8,2022-03-09T00:56:40Z,2023-10-17T21:53:17Z,,CONTRIBUTOR,,"[this page](https://labordata.bunkum.us/opdr-8335ea3?sql=with+most_recent_lu+as+%28%0D%0A++select%0D%0A++++*%0D%0A++from%0D%0A++++%28%0D%0A++++++select%0D%0A++++++++*%0D%0A++++++from%0D%0A++++++++lm_data%0D%0A++++++order+by%0D%0A++++++++f_num%2C%0D%0A++++++++receive_date+desc%0D%0A++++%29+t%0D%0A++group+by%0D%0A++++f_num%0D%0A%29%0D%0Aselect%0D%0A++aff_abbr+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+abbr_local_name%2C%0D%0A++coalesce%28%0D%0A++++regexp_match%28%27%28.*%3F%29%28%2C%3F+AFL-CIO%24%29%27%2C+union_name%29%2C%0D%0A++++regexp_match%28%27%28.*%3F%29%28+IND%24%29%27%2C+union_name%29%2C%0D%0A++++union_name%0D%0A++%29+%7C%7C+coalesce%28%27+local+%27+%7C%7C+desig_num%2C+%27+%27+%7C%7C+unit_name%29+as+full_local_name%2C%0D%0A++*%0D%0Afrom%0D%0A++most_recent_lu%0D%0Awhere+%28desig_num+IS+NOT+NULL+OR+unit_name+IS+NOT+NULL%29+AND+desig_name+%21%3D+%27HQ%27%0D%0Alimit%0D%0A++5000+offset+0)
is using about 400 MB in Firefox 97 on macOS. If you download the HTML for the page, it's about 11 MB, and if you get the CSV for the data it's about 1 MB.
It's using over 1 GB on Chrome 99.
I found this because I was trying to figure out why editing the SQL was getting very slow.
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1655/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1940346034,I_kwDOBm6k_c5zp1Sy,2199,Detailed upgrade instructions for metadata.yaml -> datasette.yaml,9599,open,0,,3268330,7,2023-10-12T16:21:25Z,2023-10-12T22:08:42Z,,OWNER,,"> `Exception: Datasette no longer accepts plugin configuration in --metadata. Move your ""plugins"" configuration blocks to a separate file - we suggest calling that datasette..json - and start Datasette with datasette -c datasette..json. See https://docs.datasette.io/en/latest/configuration.html for more details.`
>
> I think we should link directly to documentation that tells people how to perform this upgrade.
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/2190#issuecomment-1759947021_
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1931794126,I_kwDOBm6k_c5zJNbO,2198,--load-extension=spatialite not working with Windows,363004,open,0,,,0,2023-10-08T12:50:22Z,2023-10-08T12:50:22Z,,NONE,,"Using each of
`python -m datasette counties.db -m metadata.yml --load-extension=SpatiaLite`
and
`python -m datasette counties.db --load-extension=""C:\Windows\System32\mod_spatialite.dll""`
and
`python -m datasette counties.db --load-extension=C:\Windows\System32\mod_spatialite.dll`
I got the error:
```
File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\database.py"", line 209, in in_thread
self.ds._prepare_connection(conn, self.name)
File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\app.py"", line 596, in _prepare_connection
conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint])
sqlite3.OperationalError: The specified module could not be found.
```
I finally tried modifying the code in app.py to read:
```
    def _prepare_connection(self, conn, database):
        conn.row_factory = sqlite3.Row
        conn.text_factory = lambda x: str(x, ""utf-8"", ""replace"")
        if self.sqlite_extensions:
            conn.enable_load_extension(True)
            for extension in self.sqlite_extensions:
                # ""extension"" is either a string path to the extension
                # or a 2-item tuple that specifies which entrypoint to load.
                # if isinstance(extension, tuple):
                #     path, entrypoint = extension
                #     conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint])
                # else:
                conn.execute(""SELECT load_extension('C:\Windows\System32\mod_spatialite.dll')"")
```
At which point the counties example worked.
Is there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used.
On an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line:
`python -m datasette rm_toucans_23_10_07.db counties.db -m metadata.yml --load-extension=SpatiaLite`
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2198/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1426379903,PR_kwDOBm6k_c5BtJNn,1870,"don't use immutable=1, only mode=ro",536941,open,0,,,7,2022-10-27T23:33:04Z,2023-10-03T19:12:37Z,,CONTRIBUTOR,simonw/datasette/pulls/1870,"Opening db files in immutable mode sometimes leads to the file being mutated, which causes duplication in the docker image layers: see #1836, #1480
That this happens in ""immutable"" mode is surprising, because the sqlite docs say that setting this should open the database as read only.
https://www.sqlite.org/c3ref/open.html
> immutable: The immutable parameter is a boolean query parameter that indicates that the database file is stored on read-only media. When immutable is set, SQLite assumes that the database file cannot be changed, even by a process with higher privilege, and so the database is opened read-only and all locking and change detection is disabled. Caution: Setting the immutable property on a database file that does in fact change can result in incorrect query results and/or [SQLITE_CORRUPT](https://www.sqlite.org/rescode.html#corrupt) errors. See also: [SQLITE_IOCAP_IMMUTABLE](https://www.sqlite.org/c3ref/c_iocap_atomic.html).
Perhaps this is a bug in sqlite?
----
:books: Documentation preview :books:: https://datasette--1870.org.readthedocs.build/en/1870/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1870/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
777333388,MDU6SXNzdWU3NzczMzMzODg=,1168,Mechanism for storing metadata in _metadata tables,9599,open,0,,,21,2021-01-01T18:47:27Z,2023-09-28T18:29:05Z,,OWNER,,"_Original title: Perhaps metadata should all live in a `_metadata` in-memory database_
Inspired by #1150 - metadata should be exposed as an API, and for large Datasette instances that API may need to be paginated. So why not expose it through an in-memory database table?
One catch to this: plugins. #860 aims to add a plugin hook for metadata. But if the metadata comes from an in-memory table, how do the plugins interact with it?
The need to paginate over metadata does make a plugin hook that returns metadata for an individual table seem less wise, since we don't want to have to do 10,000 plugin hook invocations to show a list of all metadata.
If those plugins write directly to the in-memory table how can their contributions survive the server restarting?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1865572575,PR_kwDOBm6k_c5Yt2eO,2155,Fix hupper.start_reloader entry point,79087,open,0,,,2,2023-08-24T17:14:08Z,2023-09-27T18:44:02Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2155,"Update hupper's entry point so that click commands are processed properly.
Fixes #2123
----
:books: Documentation preview :books:: https://datasette--2155.org.readthedocs.build/en/2155/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2155/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 2, ""eyes"": 0}",0,
1907765514,I_kwDOBm6k_c5xtjEK,2195,`datasette publish` needs support for the new config/metadata split,9599,open,0,,,9,2023-09-21T21:08:12Z,2023-09-21T22:57:48Z,,OWNER,,"> ... which raises the challenge that `datasette publish` doesn't yet know what to do with a config file!
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/2194#issuecomment-1730259871_
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1825007061,I_kwDOBm6k_c5sx2XV,2123,datasette serve when invoked with --reload interprets the serve command as a file,79087,open,0,,,2,2023-07-27T19:07:22Z,2023-09-18T13:02:46Z,,NONE,,"When running `datasette serve` with the `--reload` flag, the serve command is picked up as a file argument:
```
$ datasette serve --reload test_db
Starting monitor for PID 13574.
Error: Invalid value for '[FILES]...': Path 'serve' does not exist.
Press ENTER or change a file to reload.
```
If a 'serve' file is created it launches properly (albeit with an empty database called serve):
```
$ touch serve; datasette serve --reload test_db
Starting monitor for PID 13628.
INFO: Started server process [13628]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
```
Version (running from HEAD on main):
```
$ datasette --version
datasette, version 1.0a2
```
This issue appears to have existed for a while, as https://github.com/simonw/datasette/issues/1380#issuecomment-953366110 mentions the error in a different context.
I'm happy to debug and land a patch if it's welcome.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2123/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1900026059,I_kwDOBm6k_c5xQBjL,2188,"Plugin Hooks for ""compile to SQL"" languages",15178711,open,0,,,2,2023-09-18T01:37:15Z,2023-09-18T06:58:53Z,,CONTRIBUTOR,,"There's a ton of tools/languages that compile to SQL, which may be nice in Datasette. Some examples:
- Logica https://logica.dev
- PRQL https://prql-lang.org
- Malloy, but not sure if it works with SQLite? https://github.com/malloydata/malloy
It would be cool if plugins could extend Datasette to use these languages, in both the code editor and API usage.
A few things I'd imagine a `datasette-prql` or `datasette-logica` plugin would do:
- `prql=` instead of `sql=`
- Code editor support (syntax highlighting, autocomplete)
- Hide/show SQL",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
787098345,MDU6SXNzdWU3ODcwOTgzNDU=,1191,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,9599,open,0,,3268330,12,2021-01-15T18:18:51Z,2023-09-18T06:55:52Z,,OWNER,,"Sometimes a plugin may want to add content to an existing default template - for example `datasette-search-all` adds a new search box at the top of `index.html`. I also want `datasette-upload-csvs` to add a CTA on the `database.html` page: https://github.com/simonw/datasette-upload-csvs/issues/18
Currently plugins can do this by providing a new version of the `index.html` template - but if multiple plugins try to do that only one of them will succeed.
It would be better if there were known areas of those templates which plugins could add additional content to, such that multiple plugins can use the same spot.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1191/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1899310542,I_kwDOBm6k_c5xNS3O,2187,Datasette for serving JSON only,19705106,open,0,,,0,2023-09-16T05:48:29Z,2023-09-16T05:48:29Z,,NONE,,"Hi, is there any way to use datasette for serving json only without displaying webpage? I've tried to search about this in documentation but didn't get any information",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1898927976,I_kwDOBm6k_c5xL1do,2186,Mechanism for register_output_renderer hooks to access full count,9599,open,0,,3268330,2,2023-09-15T18:57:54Z,2023-09-15T19:27:59Z,,OWNER,,"The cause of this bug:
- https://github.com/simonw/datasette-export-notebook/issues/17
Is that `datasette-export-notebook` was consulting `data[""filtered_table_rows_count""]` in the render output plugin function in order to show the total number of rows that would be exported.
That field is no longer available by default - the `""count""` field is only available if `?_extra=count` was passed.
It would be useful if plugins like this could access the total count on demand, should they need to.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1895266807,I_kwDOBm6k_c5w93n3,2184,Design decision - should configuration be exposed at /-/config ?,9599,open,0,,,0,2023-09-13T21:07:08Z,2023-09-13T21:07:38Z,,OWNER,,"> This made me think. That `{""$env"": ""ENV_VAR""}` hack was introduced back here:
>
> - https://github.com/simonw/datasette/issues/538
>
> The problem it was solving was that metadata was visible to everyone with access to the instance at `/-/metadata` but plugins clearly needed a way to set secret settings.
>
> Now that this stuff is moving to config, we have some decisions to make:
>
> 1. Add `/-/config` to let people see the configuration of their instance, and keep the `$env` trick for secret settings.
> 2. Say all configuration aside from metadata is secret and make `$env` optional or ditch it entirely.
> 3. Allow plugins to announce which of their configuration options are secret so we can automatically redact them from `/-/config`
>
> I've found `/-/metadata` extraordinarily useful as a user of Datasette - it really helps me understand exactly what's going on if I run into any problems with a plugin, if I can quickly check what the settings look like.
>
> So I'm leaning towards option 1 or 3.
_Originally posted by @simonw in https://github.com/simonw/datasette/pull/2183#discussion_r1325076924_
Also refs:
- #2093",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1781530343,I_kwDOBm6k_c5qL_7n,2093,"Proposal: Combine settings, metadata, static, etc. into a single `datasette.yaml` File",15178711,open,0,,,8,2023-06-29T21:18:23Z,2023-09-11T20:19:32Z,,CONTRIBUTOR,,"Very often I get tripped up when trying to configure my Datasette instances. For example: if I want to change the port my app listens on, do I do that with a CLI flag, a `--setting` flag, inside `metadata.json`, or an env var? If I want to up the time limit of SQL statements, is that under `metadata.json` or a setting? Where does my plugin configuration go?
Normally I need to look it up in the Datasette docs, and I quickly find my answer, but the number of places where ""config"" goes is overwhelming.
- Flat CLI flags like `--port`, `--host`, `--cors`, etc.
- `--setting`, like `default_page_size`, `sql_time_limit_ms` etc
- Inside `metadata.json`, including plugin configuration
Typically my Datasette deploys are extremely long shell commands, with multiple `--setting` and other CLI flags.
## Proposal: Consolidate all ""config"" into `datasette.toml`
I propose that we add a new `datasette.toml` that combines ""settings"", ""metadata"", and other common CLI flags like `--port` and `--cors` into a single file. It would be similar to ""Cargo.toml"" in Rust projects, ""package.json"" in Node projects, and ""pyproject.toml"" in Python, etc.
A sample of what it could look like:
```toml
# ""top level"" configuration that are currently CLI flags on `datasette serve`
[config]
port = 8020
host = ""0.0.0.0""
cors = true
# replaces multiple `--setting` flags
[settings]
base_url = ""/app/datasette/""
default_allow_sql = true
sql_time_limit_ms = 3500
# replaces `metadata.json`.
# The contents of datasette-metadata.json could be defined in this file instead, but supporting separate files is nice (since those are easy to machine-generate)
[metadata]
include=""./datasette-metadata.json""
# plugin-specific
[plugins]
[plugins.datasette-auth-github]
client_id = {env = ""DATASETTE_AUTH_GITHUB_CLIENT_ID""}
client_secret = {env = ""GITHUB_CLIENT_SECRET""}
[plugins.datasette-cluster-map]
latitude_column = ""lat""
longitude_column = ""lon""
```
## Pros
- Instead of multiple files and CLI flags, everything could be in one tidy file
- Editing config in a separate file is easier than editing CLI flags, since you don't have to kill a process + edit a command every time
- New users will know to ""just edit my `datasette.toml`"" instead of needing to learn metadata + settings + CLI flags
- Better dev experience for multiple environments. For example, could have `datasette -c datasette-dev.toml` for local dev environments (enables SQL, debug plugins, long timeouts, etc.), and a `datasette -c datasette-prod.toml` for ""production"" (lower timeouts, fewer plugins, monitoring plugins, etc.)
## Cons
- Yet another config-management system. Now Datasette users will need to know about metadata, settings, CLI flags, _and_ `datasette.toml`. However with enough documentation + announcements + examples, I think we can get ahead of it.
- If toml is chosen, would need to add a toml parser for Python version <3.11
- Multiple sources of config require priority. For example: Would `--setting default_allow_sql off` override the value inside `[settings]`? What about `--port`?
## Other Notes
### Toml
I chose toml over json because toml supports comments. I chose toml over yaml because Python 3.11 has builtin support for it. I also find toml easier to work with since it doesn't have the odd ""gotchas"" that YAML has (e.g. `3.10` resolving to `3.1`, Norway `NO` resolving to `false`, etc.). It also mimics `pyproject.toml`, which is nice. Happy to change my mind about this, however.
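For what it's worth, reading the file is only a few lines either way - a sketch, using `tomllib` on Python 3.11+ and the third-party `tomli` backport (same API) on older versions:
```python
import sys

if sys.version_info >= (3, 11):
    import tomllib
else:
    import tomli as tomllib  # third-party backport with the same API

with open('datasette.toml', 'rb') as f:  # tomllib requires binary mode
    config = tomllib.load(f)
```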
### Plugin config will be difficult
Plugin config is currently in `metadata.json` in two places:
1. Top level, under `""plugins.[plugin-name]""`. This fits well into `datasette.toml` as `[plugins.plugin-name]`
2. Table level, under `""databases.[db-name].tables.[table-name].plugins.[plugin-name]""`. This doesn't fit that well into `datasette.toml`, unless it's nested under `[metadata]`?
### Extensions, static, one-off plugins?
We could also include equivalents of `--plugins-dir`, `--static`, and `--load-extension` into `datasette.toml`, but I'd imagine there's a few security concerns there to think through.
### Explicitly list with plugins to use?
I believe Datasette by default will load all install plugins on startup, but maybe `datasette.toml` can specify a list of plugins to use? For example, a dev version of `datasette.toml` can specify `datasette-pretty-traces`, but the prod version can leave it out",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2093/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1876353656,I_kwDOBm6k_c5v1uJ4,2168,Consider a request/response wrapping hook slightly higher level than asgi_wrapper(),9599,open,0,,,6,2023-08-31T21:42:04Z,2023-09-10T17:54:08Z,,OWNER,,"There's a long justification for why this might be needed here:
- https://github.com/simonw/datasette-auth-tokens/issues/10#issuecomment-1701820001
Short version: it would be neat if it was possible to stash some data on the `request` object such that a later plugin/middleware-type-thing could use that to influence the final returned response - similar to the kinds of things you can do with Django middleware.
The `asgi_wrapper()` mechanism doesn't have access to the request or response objects - it gets `scope` and can mess around with `receive` and `send`, but those are pretty low-level primitives.
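For example, even a bare-bones `asgi_wrapper()` plugin that adds a response header has to work at the raw message level (a sketch - the header name is invented):
```python
from datasette import hookimpl

@hookimpl
def asgi_wrapper(datasette):
    def wrap(app):
        async def wrapped(scope, receive, send):
            async def wrapped_send(message):
                if message['type'] == 'http.response.start':
                    # headers here are raw lists of (bytes, bytes) pairs
                    message.setdefault('headers', []).append((b'x-example', b'1'))
                await send(message)
            await app(scope, receive, wrapped_send)
        return wrapped
    return wrap
```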
Since Datasette has well-defined `request` and `response` objects now it might be nice to have a middleware layer that can manipulate those directly.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1884330740,PR_kwDOBm6k_c5ZszDF,2174,Use $DATASETTE_INTERNAL in absence of --internal,15178711,open,0,,,3,2023-09-06T16:07:15Z,2023-09-08T00:46:13Z,,CONTRIBUTOR,simonw/datasette/pulls/2174,"Refs #2157, specifically [this comment](https://github.com/simonw/datasette/issues/2157#issuecomment-1700291967)
Passing in `--internal my_internal.db` over and over again can get repetitive.
This PR adds a new configurable env variable `DATASETTE_INTERNAL_DB_PATH`. If it's defined, it is used as the path to the internal database. Users can still override this behavior by passing in their own `--internal internal.db` flag.
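Roughly, the fallback logic would look like this (sketch only - the final flag and variable names may change):
```python
import os

# use the --internal CLI value if given, otherwise fall back to the environment variable
internal_db_path = internal or os.environ.get('DATASETTE_INTERNAL_DB_PATH')
```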
In draft mode for now, needs tests and documentation.
Side note: Maybe we can have a section in the docs that lists all the ""configuration environment variables"" that Datasette respects? I did a quick grep and found:
- `DATASETTE_LOAD_PLUGINS`
- `DATASETTE_SECRETS`
----
:books: Documentation preview :books:: https://datasette--2174.org.readthedocs.build/en/2174/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2174/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1010112818,I_kwDOBm6k_c48NRky,1479,"Win32 ""used by another process"" error with datasette publish",76450761,open,0,,,7,2021-09-28T19:12:00Z,2023-09-07T02:14:16Z,,NONE,,"I unfortunately was not successful in deploying to fly.io. Please see below the details of the three scenarios that I tried. I am also new to Datasette.
Failed to deploy. Attaching logs:
1. Tried with an app created via `flyctl apps create frosty-fog-8565` and then ran `datasette publish fly covid.db --app frosty-fog-8565`
```
Deploying frosty-fog-8565
==> Validating app configuration
--> Validating app configuration done
Services
TCP 80/443 ⇢ 8080
Error error connecting to docker: An unknown error occured.
Traceback (most recent call last):
File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main
""__main__"", mod_spec)
File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code
exec(code, run_globals)
File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__
return self.main(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main
rv = self.invoke(ctx)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke
return callback(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly
""--remote-only"",
File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__
next(self.gen)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory
tmp.cleanup()
File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup
_shutil.rmtree(self.name)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree
return _rmtree_unsafe(path, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe
_rmtree_unsafe(fullname, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe
onerror(os.rmdir, path, sys.exc_info())
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe
os.rmdir(path)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpgcm8cz66\\frosty-fog-8565'
```
2. Also tried with an app that gets autogenerated when running `flyctl launch`. This also generates the .toml file. Then ran `datasette publish fly covid.db --app dark-feather-168` **but got a different error now**
```Deploying dark-feather-168
==> Validating app configuration
Error not possible to validate configuration: server returned Post ""https://api.fly.io/graphql"": unexpected EOF
Traceback (most recent call last):
File ""c:\users\grott\anaconda3\lib\runpy.py"", line 193, in _run_module_as_main
""__main__"", mod_spec)
exec(code, run_globals)
File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__
return self.main(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main
rv = self.invoke(ctx)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke
return callback(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette_publish_fly\__init__.py"", line 156, in fly
""--remote-only"",
File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__
next(self.gen)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory
tmp.cleanup()
File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup
_shutil.rmtree(self.name)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree
return _rmtree_unsafe(path, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe
_rmtree_unsafe(fullname, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe
onerror(os.rmdir, path, sys.exc_info())
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe
os.rmdir(path)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpnoyewcre\\dark-feather-168'
```
These are also the contents of the generated **.toml file** in the second scenario:
```
# fly.toml file generated for dark-feather-168 on 2021-09-28T20:35:44+02:00
app = ""dark-feather-168""
kill_signal = ""SIGINT""
kill_timeout = 5
processes = []
[env]
[experimental]
allowed_public_ports = []
auto_rollback = true
[[services]]
http_checks = []
internal_port = 8080
processes = [""app""]
protocol = ""tcp""
script_checks = []
[services.concurrency]
hard_limit = 25
soft_limit = 20
type = ""connections""
[[services.ports]]
handlers = [""http""]
port = 80
[[services.ports]]
handlers = [""tls"", ""http""]
port = 443
[[services.tcp_checks]]
grace_period = ""1s""
interval = ""15s""
restart_limit = 6
timeout = ""2s""
```
3. Trying `datasette package covid.db` to create a local Dockerfile, to later push it via `flyctl deploy`, also fails.
```[+] Building 147.3s (11/11) FINISHED
=> [internal] load build definition from Dockerfile 0.2s
=> => transferring dockerfile: 396B 0.0s
=> [internal] load .dockerignore 0.1s
=> => transferring context: 2B 0.0s
=> [internal] load metadata for docker.io/library/python:3.8 4.7s
=> [auth] library/python:pull token for registry-1.docker.io 0.0s
=> [internal] load build context 0.1s
=> => transferring context: 82.37kB 0.0s
=> [1/5] FROM docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3 108.3s
=> => resolve docker.io/library/python:3.8@sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5 0.0s
=> => sha256:56182bcdf4d4283aa1f46944b4ef7ac881e28b4d5526720a4e9ba03a4730846a 2.22kB / 2.22kB 0.0s
=> => sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 54.93MB / 54.93MB 21.6s
=> => sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 10.87MB / 10.87MB 15.5s
=> => sha256:530de807b46a11734e2587a784573c12c5034f2f14025f838589e6c0e3b5c5b6 1.86kB / 1.86kB 0.0s
=> => sha256:ff08f08727e50193dcf499afc30594c47e70cc96f6fcfd1a01240524624264d0 8.65kB / 8.65kB 0.0s
=> => sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 5.15MB / 5.15MB 6.4s
=> => sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 54.57MB / 54.57MB 37.7s
=> => sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 196.45MB / 196.45MB 82.3s
=> => sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 6.29MB / 6.29MB 26.6s
=> => extracting sha256:955615a668ce169f8a1443fc6b6e6215f43fe0babfb4790712a2d3171f34d366 5.4s
=> => sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 16.81MB / 16.81MB 40.5s
=> => extracting sha256:2756ef5f69a5190f4308619e0f446d95f5515eef4a814dbad0bcebbbbc7b25a8 0.6s
=> => extracting sha256:911ea9f2bd51e53a455297e0631e18a72a86d7e2c8e1807176e80f991bde5d64 0.6s
=> => sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 232B / 232B 40.1s
=> => extracting sha256:27b0a22ee906271a6ce9ddd1754fdd7d3b59078e0b57b6cc054c7ed7ac301587 5.8s
=> => sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 2.35MB / 2.35MB 41.8s
=> => extracting sha256:8584d51a9262f9a3a436dea09ba40fa50f85802018f9bd299eee1bf538481077 18.8s
=> => extracting sha256:524774b7d3638702fe9ae0ea3fcfb81b027dfd75cc2fc14f0119e764b9543d58 1.2s
=> => extracting sha256:9460f6b75036e38367e2f27bb15e85777c5d6cd52ad168741c9566186415aa26 2.9s
=> => extracting sha256:9bc548096c181514aa1253966a330134d939496027f92f57ab376cd236eb280b 0.0s
=> => extracting sha256:1d87379b86b89fd3b8bb1621128f00c8f962756e6aaaed264ec38db733273543 0.8s
=> [2/5] COPY . /app 2.3s
=> [3/5] WORKDIR /app 0.2s
=> [4/5] RUN pip install -U datasette 26.9s
=> [5/5] RUN datasette inspect covid.db --inspect-file inspect-data.json 3.1s
=> exporting to image 1.2s
=> => exporting layers 1.2s
=> => writing image sha256:b5db0c205cd3454c21fbb00ecf6043f261540bcf91c2dfc36d418f1a23a75d7a 0.0s
Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
Traceback (most recent call last):
""__main__"", mod_spec)
File ""c:\users\grott\anaconda3\lib\runpy.py"", line 85, in _run_code
exec(code, run_globals)
File ""C:\Users\grott\Anaconda3\Scripts\datasette.exe\__main__.py"", line 7, in
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__
return self.main(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 782, in main
rv = self.invoke(ctx)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File ""c:\users\grott\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke
return callback(*args, **kwargs)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette\cli.py"", line 283, in package
call(args)
File ""c:\users\grott\anaconda3\lib\contextlib.py"", line 119, in __exit__
next(self.gen)
File ""c:\users\grott\anaconda3\lib\site-packages\datasette\utils\__init__.py"", line 451, in temporary_docker_directory
tmp.cleanup()
File ""c:\users\grott\anaconda3\lib\tempfile.py"", line 811, in cleanup
_shutil.rmtree(self.name)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 516, in rmtree
return _rmtree_unsafe(path, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 395, in _rmtree_unsafe
_rmtree_unsafe(fullname, onerror)
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 404, in _rmtree_unsafe
onerror(os.rmdir, path, sys.exc_info())
File ""c:\users\grott\anaconda3\lib\shutil.py"", line 402, in _rmtree_unsafe
os.rmdir(path)
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\grott\\AppData\\Local\\Temp\\tmpkb27qid3\\datasette'```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1479/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1884408624,I_kwDOBm6k_c5wUcsw,2177,Move schema tables from _internal to _catalog,9599,open,0,,,1,2023-09-06T16:58:33Z,2023-09-06T17:04:30Z,,OWNER,,"This came up in discussion over:
- https://github.com/simonw/datasette/pull/2174
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2177/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1875739055,I_kwDOBm6k_c5vzYGv,2167,Document return type of await ds.permission_allowed(),9599,open,0,,,0,2023-08-31T15:14:23Z,2023-08-31T15:14:23Z,,OWNER,,"The return type isn't documented here: https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/docs/internals.rst#L327-L350
On inspecting the code I'm not 100% sure if it's possible for this method to return `None`, or if it can only return `True` or `False`. Need to confirm that.
https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/datasette/app.py#L822C15-L853",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1865869205,I_kwDOBm6k_c5vNueV,2157,"Proposal: Make the `_internal` database persistent, customizable, and hidden",15178711,open,0,,,3,2023-08-24T20:54:29Z,2023-08-31T02:45:56Z,,CONTRIBUTOR,,"The current `_internal` database is used by Datasette core to cache info about databases/tables/columns/foreign keys of databases in a Datasette instance. It's a temporary database created at startup, that can only be seen by the root user. See an [example `_internal` DB here](https://latest.datasette.io/_internal), after [logging in as root](https://latest.datasette.io/login-as-root).
The current `_internal` database has a few rough edges:
- It's part of `datasette.databases`, so many plugins have to specifically exclude `_internal` from their queries [examples here](https://github.com/search?q=datasette+hookimpl+%22_internal%22+language%3APython+-path%3Adatasette%2F&ref=opensearch&type=code)
- It's only used by Datasette core and can't be used by plugins or 3rd parties
- It's created from scratch at startup and stored in memory. That's fine, the performance is great, but persistent storage would be nice.
Additionally, it would be really nice if plugins could use this `_internal` database to store their own configuration, secrets, and settings. For example:
- `datasette-auth-tokens` [creates a `_datasette_auth_tokens` table](https://github.com/simonw/datasette-auth-tokens/blob/main/datasette_auth_tokens/__init__.py#L15) to store auth token metadata. This could be moved into the `_internal` database to avoid writing to the guest database
- `datasette-socrata` [creates a `socrata_imports`](https://github.com/simonw/datasette-socrata/blob/1409aa9b4d2fc3aff286b52e73af33b5786d56d0/datasette_socrata/__init__.py#L190-L198) table, which also can be in `_internal`
- `datasette-upload-csvs` [creates a `_csv_progress_`](https://github.com/simonw/datasette-upload-csvs/blob/main/datasette_upload_csvs/__init__.py#L154) table, which can be in `_internal`
- `datasette-write-ui` wants to have the ability for users to toggle whether a table appears editable, which can be either in `datasette.yaml` or on-the-fly by storing config in `_internal`
In general, these are specific features that Datasette plugins would have access to if there was a central internal database they could read/write to:
- **Dynamic configuration**. Changing the `datasette.yaml` file works, but can be tedious to restart the server every time. Plugins can define their own configuration table in `_internal`, and could read/write to it to store configuration based on user actions (cell menu click, API access, etc.)
- **Caching**. If a plugin or Datasette Core needs to cache some expensive computation, they can store it inside `_internal` (possibly as a temporary table) instead of managing their own caching solution.
- **Audit logs**. If a plugin performs some sensitive operations, they can log usage info to `_internal` for others to audit later.
- **Long running process status**. Many plugins (`datasette-upload-csvs`, `datasette-litestream`, `datasette-socrata`) perform tasks that run for a really long time, and want to give continuous status updates to the user. They can store this info inside `_internal`
- **Safer authentication**. Passwords and authentication plugins usually store credentials/hashed secrets in configuration files or environment variables, which can be difficult to handle. Now, they can store them in `_internal`
## Proposal
- We remove `_internal` from [`datasette.databases`](https://docs.datasette.io/en/latest/internals.html#databases) property.
- We add new `datasette.get_internal_db()` method that returns the `_internal` database, for plugins to use
- We add a new `--internal internal.db` flag. If provided, then the `_internal` DB will be sourced from that file, and further updates will be persisted to that file (instead of an in-memory database)
- When creating internal.db, create a new `_datasette_internal` table to mark it as a ""datasette internal database""
- In `datasette serve`, we check for the existence of the `_datasette_internal` table. If it exists, we assume the user provided that file in error and raise an error. This is to limit the chance that someone accidentally publishes their internal database to the internet. We could optionally add a `--unsafe-allow-internal` flag (or database plugin) that allows someone to do this if they really want to.
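As a rough sketch of how a plugin might use the proposed `datasette.get_internal_db()` method (illustration only, assuming it returns the same kind of `Database` object that `datasette.get_database()` returns today):

```python
# Sketch only: get_internal_db() is the proposed API above and does not exist yet
from datasette import hookimpl


@hookimpl
def startup(datasette):
    async def create_config_table():
        internal_db = datasette.get_internal_db()  # proposed method
        await internal_db.execute_write(
            'create table if not exists my_plugin_config (key text primary key, value text)'
        )

    return create_config_table
```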
## New features unlocked with this
These features don't really need a standardized `_internal` table per se (plugins could currently configure their own long-term storage features if they really wanted to), but it would make it much simpler to create these kinds of features with a persistent application database.
- **`datasette-comments`** : A plugin for commenting on rows or specific values in a database. Comment contents + threads + email notification info can be stored in `_internal`
- **Bookmarks**: ""Bookmarking"" an SQL query could be stored in `_internal`, or a URL link shortener
- **Webhooks**: If a plugin wants to either consume a webhook or create a new one, they can store hashed credentials/API endpoints in `_internal`",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
594237015,MDU6SXNzdWU1OTQyMzcwMTU=,718,Plugin idea: datasette-redirects,9599,open,0,,,0,2020-04-05T03:41:38Z,2023-08-30T22:17:31Z,,OWNER,,"I just had to write a one-off custom plugin to redirect niche-musems.com to www.niche-museums.com (https://github.com/simonw/museums/issues/21) - it would be great if this kind of thing could be handled by a configurable plugin.
https://github.com/simonw/museums/blob/6b1faf00c463b2228860d4d62d104b11935e01b1/plugins/redirect_www.py",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/718/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,reopened
1865649347,I_kwDOBm6k_c5vM4zD,2156,datasette -s/--setting option for setting nested configuration options,9599,open,0,,,4,2023-08-24T18:09:27Z,2023-08-28T19:33:05Z,,OWNER,,"> I've been thinking about what it might look like to allow command-line arguments to be used to define _any_ of the configuration options in `datasette.yml`, as alternative and more convenient syntax.
>
> Here's what I've come up with:
> ```
> datasette \
> -s settings.sql_time_limit_ms 1000 \
> -s plugins.datasette-auth-tokens.manage_tokens true \
> -s plugins.datasette-auth-tokens.manage_tokens_database tokens \
> mydatabase.db tokens.db
> ```
> Which would be equivalent to `datasette.yml` containing this:
> ```yaml
> plugins:
> datasette-auth-tokens:
> manage_tokens: true
> manage_tokens_database: tokens
> settings:
> sql_time_limit_ms: 1000
> ```
More details in https://github.com/simonw/datasette/issues/2143#issuecomment-1690792514
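A minimal sketch of how those dotted `-s` keys could be expanded into the nested structure above (hypothetical helper, not actual Datasette code):

```python
# Hypothetical helper: expands dotted -s keys into the nested datasette.yaml shape
import json


def set_nested(config, dotted_key, value):
    keys = dotted_key.split('.')
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    try:
        # Reuse JSON parsing so 'true' becomes True and '1000' becomes 1000
        value = json.loads(value)
    except ValueError:
        pass  # leave plain strings like 'tokens' untouched
    node[keys[-1]] = value


config = {}
set_nested(config, 'settings.sql_time_limit_ms', '1000')
set_nested(config, 'plugins.datasette-auth-tokens.manage_tokens', 'true')
set_nested(config, 'plugins.datasette-auth-tokens.manage_tokens_database', 'tokens')
print(config['settings'])  # {'sql_time_limit_ms': 1000}
print(config['plugins'])   # {'datasette-auth-tokens': {'manage_tokens': True, 'manage_tokens_database': 'tokens'}}
```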
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1866815458,PR_kwDOBm6k_c5YyF-C,2159,Implement Dark Mode colour scheme,3315059,open,0,,,0,2023-08-25T10:46:23Z,2023-08-25T10:46:35Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2159,"Closes #2095.
----
:books: Documentation preview :books:: https://datasette--2159.org.readthedocs.build/en/2159/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2159/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1,
1865983069,PR_kwDOBm6k_c5YvQSi,2158,add brand option to metadata.json.,52261150,open,0,,,0,2023-08-24T22:37:41Z,2023-08-24T22:37:57Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2158,"This adds a brand link to the top navbar if 'brand' key is populated in metadata.json. The link will be either '#' or use the contents of 'brand_url' in metadata.json for href.
I was able to get this done on my own site by replacing `templates/_crumbs.html` with a custom version, but I thought it would be nice to incorporate this in the tool directly.
![image](https://github.com/simonw/datasette/assets/52261150/fdfe9bb5-fee4-466c-8074-6132071d94e6)
----
:books: Documentation preview :books:: https://datasette--2158.org.readthedocs.build/en/2158/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2158/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
814595021,MDU6SXNzdWU4MTQ1OTUwMjE=,1241,Share button for copying current URL,7107523,open,0,,,6,2021-02-23T15:55:40Z,2023-08-24T20:09:52Z,,NONE,,"I use datasette in an `iframe` inside another HTML file that contains other ways to represent my data (mostly leaflets maps built with R on summarized data), and the datasette `iframe` is a tab in that page.
This particular use prevents users from accessing the full URLs of their datasette views and queries, which is a shame because the way datasette handles URLs to make every view or query easy to share is awesome. I know how to get the URL from the context menu of my browser, but I don't think many visitors would do it or even notice that datasette uses permalinks for pretty much every action they do. Would it be possible to add a ""Share link"" button to the interface, either in datasette itself or in a plugin?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1241/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1855885427,I_kwDOBm6k_c5unpBz,2143,De-tangling Metadata before Datasette 1.0,15178711,open,0,,,24,2023-08-18T00:51:50Z,2023-08-24T18:28:27Z,,CONTRIBUTOR,,"Metadata in Datasette is a really powerful feature, but is a bit difficult to work with. It was initially a way to add ""metadata"" about your ""data"" in Datasette instances, like descriptions for databases/tables/columns, titles, source URLs, licenses, etc. But it later became the go-to spot for other Datasette features that have nothing to do with metadata, like permissions/plugins/canned queries.
Specifically, I've found the following problems when working with Datasette metadata:
1. Metadata cannot be updated without re-starting the entire Datasette instance.
2. The `metadata.json`/`metadata.yaml` has become a kitchen sink of unrelated (imo) features like plugin config, authentication config, canned queries
3. The Python APIs for defining extra metadata are a bit awkward (the `datasette.metadata()` class, `get_metadata()` hook, etc.)
## Possible solutions
Here's a few ideas of Datasette core changes we can make to address these problems.
### Re-vamp the Datasette Python metadata APIs
The Datasette object has a single `datasette.metadata()` method that's a bit difficult to work with. There's also no Python API for inserting new metadata, so plugins have to rely on the `get_metadata()` hook.
The `get_metadata()` hook can also be improved - it doesn't work with async functions yet, so you're quite limited in what you can do.
(I'm a bit fuzzy on what to actually do here, but I imagine it'll be very small breaking changes to a few Python methods)
### Add an optional `datasette_metadata` table
Datasette should detect and use metadata stored in a new special table called `datasette_metadata`. This would be a regular table that a user can edit on their own, and would serve as a ""live updating"" source of metadata that can be changed while the Datasette instance is running.
Not too sure what the schema would look like, but I'd imagine:
```sql
CREATE TABLE datasette_metadata(
level text,
target any,
key text,
value any,
primary key (level, target)
)
```
Every row in this table would map to a single metadata ""entry"".
- `level` would be one of ""datasette"", ""database"", ""table"", ""column"", which is the ""level"" the entry describes. For example, `level=""table""` means it is metadata about a specific table, `level=""database""` for a specific database, or `level=""datasette""` for the entire Datasette instance.
- `target` would ""point"" to the specific object the entry metadata is about, and would depend on what `level` is specific.
- `level=""database""`: `target` would be the string name of the database that the metadata entry is about. ex `""fixtures""`
- `level=""table""`: `target` would be a JSON array of two strings. The first element would be the database name, and the second would be the table name. ex `[""fixtures"", ""students""]`
- `level=""column""`: `target` would be a JSON array of 3 strings: The database name, table name, and column name. Ex `[""fixtures"", ""students"", ""student_id""`]
- `key` would be the type of metadata entry the row has, similar to the current ""keys"" that exist in `metadata.json`. Ex `""about_url""`, `""source""`, `""description""`, etc
- `value` would be the text value of the metadata entry. The literal text value of a description, about_url, column_label, etc
A quick sample:
level | target | key | value
-- | -- | -- | --
datasette | NULL | title | my datasette title...
db | fixtures | source |
table | [""fixtures"", ""students""] | label_column | student_name
column | [""fixtures"", ""students"", ""birthdate""] | description |
This `datasette_metadata` would be configured with other tools, and hopefully not manually by end users. Datasette Core could also offer a UI for editing entries in `datasette_metadata`, to update descriptions/columns on the fly.
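To make the shape concrete, here is a small illustration (not part of the proposal itself) that creates the table described above and inserts entries along the lines of the sample:

```python
# Illustration only: schema and sample values mirror the proposal above
import json
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute(
    'create table datasette_metadata (level text, target any, key text, value any, '
    'primary key (level, target))'
)
rows = [
    ('datasette', None, 'title', 'my datasette title...'),
    ('database', 'fixtures', 'source', None),
    ('table', json.dumps(['fixtures', 'students']), 'label_column', 'student_name'),
    ('column', json.dumps(['fixtures', 'students', 'birthdate']), 'description', None),
]
conn.executemany('insert into datasette_metadata values (?, ?, ?, ?)', rows)
for row in conn.execute('select * from datasette_metadata'):
    print(row)
```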
### Re-vamp `metadata.json` and move non-metadata config to another place
The motivation behind this is that it's awkward that `metadata.json` contains config about things that are not strictly metadata, including:
- Plugin configuration
- [Authentication/permissions](https://docs.datasette.io/en/latest/authentication.html#access-permissions-in-metadata) (ex the `allow` key on datasettes/databases/tables)
- Canned queries. This might be controversial, but in my mind, canned queries are application-specific code and configuration, and don't describe the data that exists in SQLite databases.
I think we should move these outside of `metadata.json` and into a different file. The `datasette.json` idea in #2093 may be a good solution here: plugin/permissions/canned queries can be defined in `datasette.json`, while `metadata.json`/`datasette_metadata` will strictly be about documenting databases/tables/columns.
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1858228057,I_kwDOBm6k_c5uwk9Z,2147,Plugin hook for database queries that are run,18899,open,0,,,6,2023-08-20T18:43:50Z,2023-08-24T03:54:35Z,,NONE,,"I'm interested in making a plugin that saves every query that gets run to a table in the database. (I know about datasette-query-history but thought it would be good to have a server-side option.)
As far as I can tell reading the docs, there isn't really a hook setup to allow this.
Maybe I could hack it with some of the hooks that are passed requests, but that doesn't seem good.
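Purely as a sketch of the kind of thing I mean (this hook does not exist today; the name and signature here are made up):

```python
# Hypothetical hook - not part of Datasette's current plugin hooks
from datasette import hookimpl


@hookimpl
def query_executed(datasette, database, sql, params):
    # A real plugin would write the query into a log table here instead of printing it
    print(f'[{database}] {sql} -- params: {params}')
```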
I'm a little surprised this isn't possible, so I thought I would open an issue and see if that's a deeply considered decision or just ""haven't needed it yet."" I'm potentially interested in implementing the hook if the latter.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2147/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1864112887,PR_kwDOBm6k_c5Yo7bk,2151,Test Datasette on multiple SQLite versions,15178711,open,0,,,1,2023-08-23T22:42:51Z,2023-08-23T22:58:13Z,,CONTRIBUTOR,simonw/datasette/pulls/2151,"still testing, hope it works!
----
:books: Documentation preview :books:: https://datasette--2151.org.readthedocs.build/en/2151/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2151/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1,
459509126,MDU6SXNzdWU0NTk1MDkxMjY=,516,Enforce import sort order with isort,9599,open,0,,,8,2019-06-22T20:35:50Z,2023-08-23T02:15:36Z,,OWNER,,"I want to use isort to order imports. A few steps here:
- [x] Add a .isort.cfg file (see below)
- [x] Use `isort -rc` to reformat existing code
- [ ] Commit this change
- [x] Add a unit test that ensures future changes remain isort compatible",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1857234285,I_kwDOBm6k_c5usyVt,2145,If a row has a primary key of `null` various things break,9599,open,0,,,23,2023-08-18T20:06:28Z,2023-08-21T17:30:01Z,,OWNER,,"Stumbled across this while experimenting with `datasette-write-ui`. The error I got was a 500 on the `/db` page:
> `'NoneType' object has no attribute 'encode'`
Tracked it down to this code, which assembles the URL for a row page:
https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L120-L134
That's because `tilde_encode` can't handle `None`: https://github.com/simonw/datasette/blob/943df09dcca93c3b9861b8c96277a01320db8662/datasette/utils/__init__.py#L1175-L1178
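Not Datasette's actual implementation, but a tiny illustration of the failure mode: building the tilde-encoded path needs a string, so a `None` primary key blows up.

```python
# Simplified stand-in for tilde_encode(), just to show why None breaks it
from urllib.parse import quote


def tilde_encode(value):
    return quote(value, safe='').replace('%', '~')


print(tilde_encode('hello/world'))  # hello~2Fworld
# tilde_encode(None) raises an exception for the same underlying reason as the
# traceback above: None cannot be encoded. Row URL assembly would need to
# special-case null primary keys (for example by falling back to the rowid).
```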
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
476852861,MDU6SXNzdWU0NzY4NTI4NjE=,568,Add database_color as a configurable option,50906992,open,0,,,1,2019-08-05T13:14:45Z,2023-08-11T05:19:42Z,,NONE,,This would be really useful as it would allow us to tie in with colour schemes.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1802613340,PR_kwDOBm6k_c5VZhfw,2100,Make primary key view accessible to render_cell hook,1563881,open,0,,,0,2023-07-13T09:30:36Z,2023-08-10T13:15:41Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2100,"
----
:books: Documentation preview :books:: https://datasette--2100.org.readthedocs.build/en/2100/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1838469176,I_kwDOBm6k_c5tlNA4,2127,Context base class to support documenting the context,9599,open,0,,3268330,3,2023-08-07T00:01:02Z,2023-08-10T01:30:25Z,,OWNER,,"This idea first came up here:
- https://github.com/simonw/datasette/issues/2112#issuecomment-1652751140
If `datasette.render_template(...)` takes an optional `Context` subclass as an alternative to a context dictionary, I could then use dataclasses to define the context made available to specific templates - which then gives me something I can use to help document what they are.
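A rough sketch of the idea (the field names here are invented for illustration, not the eventual API):

```python
# Sketch: a dataclass per template documents exactly what that template receives
from dataclasses import asdict, dataclass


@dataclass
class DatabaseContext:
    database: str
    tables: list
    show_hidden: bool = False


async def database_page(datasette, request):
    context = DatabaseContext(database='fixtures', tables=['facetable'])
    # render_template() could grow support for accepting the dataclass directly;
    # today the equivalent is converting it to a dict first
    return await datasette.render_template('database.html', asdict(context), request=request)
```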
Also refs:
- https://github.com/simonw/datasette/issues/1510",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1843821954,I_kwDOBm6k_c5t5n2C,2137,Redesign row default JSON,9599,open,0,,8755003,1,2023-08-09T18:49:11Z,2023-08-09T19:02:47Z,,OWNER,,"This URL here:
https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables
```json
{
""database"": ""fixtures"",
""table"": ""simple_primary_key"",
""rows"": [
{
""id"": ""1"",
""content"": ""hello""
}
],
""columns"": [
""id"",
""content""
],
""primary_keys"": [
""id""
],
""primary_key_values"": [
""1""
],
""units"": {},
""foreign_key_tables"": [
{
""other_table"": ""foreign_key_references"",
""column"": ""id"",
""other_column"": ""foreign_key_with_blank_label"",
""count"": 0,
""link"": ""/fixtures/foreign_key_references?foreign_key_with_blank_label=1""
},
{
""other_table"": ""foreign_key_references"",
""column"": ""id"",
""other_column"": ""foreign_key_with_label"",
""count"": 1,
""link"": ""/fixtures/foreign_key_references?foreign_key_with_label=1""
},
{
""other_table"": ""complex_foreign_keys"",
""column"": ""id"",
""other_column"": ""f3"",
""count"": 1,
""link"": ""/fixtures/complex_foreign_keys?f3=1""
},
{
""other_table"": ""complex_foreign_keys"",
""column"": ""id"",
""other_column"": ""f2"",
""count"": 0,
""link"": ""/fixtures/complex_foreign_keys?f2=1""
},
{
""other_table"": ""complex_foreign_keys"",
""column"": ""id"",
""other_column"": ""f1"",
""count"": 1,
""link"": ""/fixtures/complex_foreign_keys?f1=1""
}
],
""query_ms"": 4.226590999678592,
""source"": ""tests/fixtures.py"",
""source_url"": ""https://github.com/simonw/datasette/blob/main/tests/fixtures.py"",
""license"": ""Apache License 2.0"",
""license_url"": ""https://github.com/simonw/datasette/blob/main/LICENSE"",
""ok"": true,
""truncated"": false
}
```
That `?_extras=` should be `?_extra=` - plus the row JSON should be redesigned to fit the new default JSON representation.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1822939274,I_kwDOBm6k_c5sp9iK,2113,Implement and document extras for the new query view page,9599,open,0,,8755003,3,2023-07-26T18:24:01Z,2023-08-09T17:35:22Z,,OWNER,,- #2109 ,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1840417903,I_kwDOBm6k_c5tsoxv,2131,Refactor code that supports templates_considered comment,9599,open,0,,3268330,1,2023-08-08T01:28:36Z,2023-08-09T15:27:41Z,,OWNER,,"I ended up duplicating it here: https://github.com/simonw/datasette/blob/7532feb424b1dce614351e21b2265c04f9669fe2/datasette/views/database.py#L164-L167
I think it should move to `datasette.render_template()` - and maybe have a renamed template variable too.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2131/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1840324765,I_kwDOBm6k_c5tsSCd,2129,CSV ?sql= should indicate errors,9599,open,0,,3268330,1,2023-08-07T23:13:04Z,2023-08-08T02:02:21Z,,OWNER,,"> https://latest.datasette.io/_memory.csv?sql=select+blah is a blank page right now:
```bash
curl -I 'https://latest.datasette.io/_memory.csv?sql=select+blah'
```
```
HTTP/2 200
access-control-allow-origin: *
access-control-allow-headers: Authorization, Content-Type
access-control-expose-headers: Link
access-control-allow-methods: GET, POST, HEAD, OPTIONS
access-control-max-age: 3600
content-type: text/plain; charset=utf-8
x-databases: _memory, _internal, fixtures, fixtures2, extra_database, ephemeral
date: Mon, 07 Aug 2023 23:12:15 GMT
server: Google Frontend
```
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/2118#issuecomment-1668688947_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
774332247,MDExOlB1bGxSZXF1ZXN0NTQ1MjY0NDM2,1159,Improve the display of facets information,552629,open,0,,3268330,9,2020-12-24T11:01:47Z,2023-07-31T18:57:59Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/1159,"This PR changes the display of facets to hopefully make them more readable.
Before | After
---|---
![image](https://user-images.githubusercontent.com/552629/103084609-b1ec2980-45df-11eb-85bc-68ab8df3e8d9.png) | ![image](https://user-images.githubusercontent.com/552629/103085220-620e6200-45e1-11eb-8189-5dd5d3e2569e.png)
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1159/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1824457306,I_kwDOBm6k_c5svwJa,2122,Parameters on canned queries: fixed or query-generated list?,1563881,open,0,,,0,2023-07-27T14:07:07Z,2023-07-27T14:07:07Z,,NONE,,"Hi,
currently parameters in canned queries are just text fields. It would be cool to have one of the options below. Would you accept a PR doing something in this direction? (Possibly this could even work as a plugin.)
* adding facets, which would work like facets on tables or views, giving a list of selectable options (and leaving parameters as is)
* making it possible to provide a query which returns selectable values for a parameter, e.g.
```
calendar_entries_current_instrument:
sql: |
select * from calendar_entries
where
DTEND_UNIX > UNIXEPOCH() and
DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and
current = 1 and
MACHINE = :instrument
order by
DTSTART_UNIX
params:
days:
sql: ""SELECT VALUE FROM generate_series(1, 30, 1)""
# this obviously requires the corresponding sqlite extension
instrument:
sql: ""SELECT DISTINCT MACHINE FROM calendar_entries""
```
* making it possible to provide a fixed list of parameters
```
calendar_entries_current_instrument:
sql: |
select * from calendar_entries
where
DTEND_UNIX > UNIXEPOCH() and
DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and
current = 1 and
MACHINE = :instrument
order by
DTSTART_UNIX
params:
days:
values: [1, 2, 3, 5, 10, 20, 30]
instrument:
values: [supermachine, crappymachine, boringmachine]
```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2122/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1823428714,I_kwDOBm6k_c5sr1Bq,2120,Add __all__ to datasette/__init__.py,9599,open,0,,,0,2023-07-27T01:07:10Z,2023-07-27T01:07:10Z,,OWNER,,"Currently looks like this: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/__init__.py#L1-L6
Adding `__all__ = [""Permission"", ""Forbidden""...]` would let me get rid of those `# noqa` comments.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1822813627,I_kwDOBm6k_c5spe27,2108,some (many?) SQL syntax errors are not throwing errors with a .csv endpoint,536941,open,0,,,0,2023-07-26T16:57:45Z,2023-07-26T16:58:07Z,,CONTRIBUTOR,,"here's a CTE query that should always fail with a syntax error:
```sql
with foo as (nonsense)
select
*
from
foo;
```
when we make this query against the default endpoint, we do indeed get a 400 status code and the problem is returned to the user: https://global-power-plants.datasettes.com/global-power-plants?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B
but, if we use the csv endpoint, we get a 200 status code and no indication of a problem: https://global-power-plants.datasettes.com/global-power-plants.csv?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B
same with this bad sql
```sql
select
a,
from
foo;
```
https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B
vs
https://global-power-plants.datasettes.com/global-power-plants.csv?sql=select%0D%0A++a%2C%0D%0Afrom%0D%0A++foo%3B
but, datasette catches this bad sql at both endpoints:
```sql
slect
a
from
foo;
```
https://global-power-plants.datasettes.com/global-power-plants?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B
https://global-power-plants.datasettes.com/global-power-plants.csv?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1811824307,I_kwDOBm6k_c5r_j6z,2105,When reverse proxying datasette with nginx an URL element gets erronously added,2235371,open,0,,,3,2023-07-19T12:16:53Z,2023-07-21T21:17:09Z,,NONE,,"I use this nginx config:
```
location /datasette-llm {
return 302 /datasette-llm/;
}
location /datasette-llm/ {
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection ""Upgrade"";
proxy_http_version 1.1;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto https;
proxy_set_header X-Forwarded-Host $http_host;
proxy_set_header Host $host;
proxy_max_temp_file_size 0;
proxy_pass http://127.0.0.1:8001/datasette-llm/;
proxy_redirect http:// https://;
proxy_buffering off;
proxy_request_buffering off;
proxy_set_header Origin '';
client_max_body_size 0;
auth_basic ""datasette-llm"";
auth_basic_user_file /etc/nginx/custom-userdb;
}
```
Then I start datasette with this command:
```
datasette serve --setting base_url /datasette-llm/ $(llm logs path)
```
Everything else works right, except the links in ""This data as json, CSV"".
They get an extra URL element ""datasette-llm"" like this:
https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.json?sql=select+*+from+_llm_migrations
https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.csv?sql=select+*+from+_llm_migrations&_size=max
When I remove that extra ""datasette-llm"" from the URL, those links work too.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1808215339,I_kwDOBm6k_c5rxy0r,2104,Tables starting with an underscore should be treated as hidden,9599,open,0,,,2,2023-07-17T17:13:53Z,2023-07-18T22:41:37Z,,OWNER,,"Plugins can then take advantage of this pattern, for example:
- https://github.com/simonw/datasette-auth-tokens/pull/8",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1808116827,I_kwDOBm6k_c5rxaxb,2103,data attribute on Datasette tables exposing the primary key of the row,9599,open,0,,,0,2023-07-17T16:18:25Z,2023-07-17T16:18:25Z,,OWNER,,Maybe put it on the `<tr>` but probably better to go on the `td.type-pk`.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1765870617,I_kwDOBm6k_c5pQQwZ,2087,`--settings settings.json` option,9599,open,0,,,2,2023-06-20T17:48:45Z,2023-07-14T17:02:03Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120705940728066080
> May I add a request to the whole metadata / settings ? Allow to pass `--settings path/to/settings.json` instead of having to rely exclusively on directory mode to centralize settings (this would reflect the behavior of providing metadata)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2087/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1803264272,I_kwDOBm6k_c5re6EQ,2101,alter: true support for JSON write API,9599,open,0,,,1,2023-07-13T15:24:11Z,2023-07-13T15:24:18Z,,OWNER,,"Requested here: https://discord.com/channels/823971286308356157/823971286941302908/1129034187073134642
> The former datasette-insert plugin had an option `?alter=1` to auto-add new columns. Does the JSON write API also have this?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1794604602,PR_kwDOBm6k_c5U-akg,2096,Clarify docs for descriptions in metadata,15906,open,0,,,0,2023-07-08T01:57:58Z,2023-07-08T01:58:13Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2096,"G'day! I got confused while debugging, earlier today. That's on me, but it does strike me a little repetition in the metadata documentation might help those flicking around it rather than reading it from top to bottom. No worries if you think otherwise.
----
:books: Documentation preview :books:: https://datasette--2096.org.readthedocs.build/en/2096/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2096/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1794097871,I_kwDOBm6k_c5q78LP,2095,"Introduce ""dark mode"" CSS",3315059,open,0,,,0,2023-07-07T19:15:58Z,2023-07-07T19:15:58Z,,NONE,,Using [the CSS media query `prefers-color-scheme`](https://developer.mozilla.org/en-US/docs/Web/CSS/@media/prefers-color-scheme) we can provide a dark-mode version of Datasette,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2095/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1783304750,I_kwDOBm6k_c5qSxIu,2094,JS Plugin Hooks for the Code Editor,15178711,open,0,,,0,2023-07-01T00:51:57Z,2023-07-01T00:51:57Z,,CONTRIBUTOR,,"When #2052 merges, I'd like to add support to add extensions/functions to the Datasette code editor.
I'd eventually like to build a JS plugin for [`sqlite-docs`](https://github.com/asg017/sqlite-docs), to add things like:
- Inline documentation for tables/columns on hover
- Inline docs for custom functions that are loaded in
- More detailed autocomplete for tables/columns/functions
I did some hacking to see what this would look like, see here:
There can be a new hook that allows JS plugins to add new ""extension"" in the CodeMirror editorview here:
https://github.com/simonw/datasette/blob/8cd60fd1d899952f1153460469b3175465f33f80/datasette/static/cm-editor-6.0.1.js#L25
Will need some more planning. For example, the Codemirror bundle in Datasette has functions that we could re-export for plugins to use (so we don't load 2 version of `""@codemirror/autocomplete""`, for example. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2094/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1781005740,I_kwDOBm6k_c5qJ_2s,2090,Adopt ruff for linting,9599,open,0,,,2,2023-06-29T14:56:43Z,2023-06-29T15:05:04Z,,OWNER,,https://beta.ruff.rs/docs/,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2090/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1054244712,I_kwDOBm6k_c4-1n9o,1510,Datasette 1.0 documented template context (maybe via API docs),9599,open,0,,3268330,3,2021-11-15T23:23:58Z,2023-06-28T02:05:21Z,,OWNER,,Documented context plus protective unit tests. Goal is that custom templates built for 1.x will not break without a 2.x release.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
323223872,MDU6SXNzdWUzMjMyMjM4NzI=,260,Validate metadata.json on startup,9599,open,0,,,7,2018-05-15T13:42:56Z,2023-06-21T12:51:22Z,,OWNER,,"It's easy to misspell the name of a database or table and then be puzzled when the metadata settings silently fail.
To avoid this, let's sanity check the provided metadata.json on startup and quit with a useful error message if we find any obvious mistakes.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1764792125,I_kwDOBm6k_c5pMJc9,2086,Show information on startup in directory configuration mode,9599,open,0,,,0,2023-06-20T07:13:33Z,2023-06-20T07:13:33Z,,OWNER,,"https://discord.com/channels/823971286308356157/823971286941302908/1120516587036889098
> One thing that would be helpful would be message at launch indicating a metadata.json is getting picked up. I'm using directory mode and was editing the wrong file for awhile before I realize nothing I was doing was having any effect.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2086/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1762180409,I_kwDOBm6k_c5pCL05,2085,Interactive row selection in Datasette ,24938923,open,0,,,0,2023-06-18T08:29:45Z,2023-06-18T08:31:23Z,,NONE,,"Simon did an excellent [prototype](https://til.simonwillison.net/datasette/row-selection-prototype) of an interactive row selection in Datasette.
I hope this [functionality](https://camo.githubusercontent.com/3d4a0f31fb6a27fd279f809af5b53dc3b76faa63c7721e228951c5252b645a77/68747470733a2f2f7374617469632e73696d6f6e77696c6c69736f6e2e6e65742f7374617469632f323032332f6461746173657474652d7069636b65722e676966) can be turned into a Datasette plugin.
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2085/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1761613778,I_kwDOBm6k_c5pABfS,2084,Support facets for columns that contain timestamps,19492893,open,0,,,0,2023-06-17T03:33:54Z,2023-06-17T03:33:54Z,,NONE,,"
Django has this very nice filter for datetime fields -
It would be nice to have something similar to facet by a field that contains a timestamp in datasette too - Which doesn't seem to do anything with timestamps right now...
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2084/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1734786661,PR_kwDOBm6k_c5R0fcK,2082,Catch query interrupted on facet suggest row count,10843208,open,0,,,0,2023-05-31T18:42:46Z,2023-05-31T18:45:26Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2082,"Just like facet's `suggest()` is trapping `QueryInterrupted` for facet columns, we also need to trap `get_row_count()`, which can reach timeout if database tables are big enough.
I've included `get_columns()` inside the block as that's just another query, even though it's a really cheap one and might never raise the exception.
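Roughly the shape of the change (a sketch, not the exact diff):

```python
# Sketch: trap QueryInterrupted around the row count, the same way suggest()
# already traps it for individual facet columns
from datasette.database import QueryInterrupted


async def safe_row_count(db, table):
    try:
        result = await db.execute(f'select count(*) from [{table}]')
        return result.first()[0]
    except QueryInterrupted:
        # Counting timed out on a big table - skip facet suggestion for it
        return None
```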
----
:books: Documentation preview :books:: https://datasette--2082.org.readthedocs.build/en/2082/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2082/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1727478903,I_kwDOBm6k_c5m9zx3,2081,Update Endpoints defined in metadata throws 403 Forbidden after a while,15085007,open,0,,,0,2023-05-26T11:52:30Z,2023-05-26T11:52:30Z,,NONE,,"Hello. I expose an endpoint to update `tasks`:
```
{
""title"": ""My Datasette Instance"",
""databases"": {
""tasks"": {
""queries"": {
""update_task"": {
""sql"": ""UPDATE tasks SET status = :status, result = :result, systemMessage = :systemMessage WHERE queueID = :queueID"",
""write"": true,
""on_success_message"": ""Task updated"",
""on_success_redirect"": ""/tasks/tasks.json"",
""on_error_message"": ""Task update failed"",
""on_error_redirect"": ""/tasks.json"",
""params"": [""queueID"", ""taskData"", ""status"", ""result"", ""systemMessage""]
}
}
}
}
}
```
This works really well! But after a while, the Datasette instance answers with **403 Forbidden**.
I have to delete the database and recreate it in order for it to work again.
Any help here? (´。_。`)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2081/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1715468032,PR_kwDOBm6k_c5QzEAM,2076,Datsette gpt plugin,130708713,open,0,,,0,2023-05-18T11:22:30Z,2023-05-18T11:22:45Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2076,"
----
:books: Documentation preview :books:: https://datasette--2076.org.readthedocs.build/en/2076/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2076/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1708981860,PR_kwDOBm6k_c5QdMea,2074,sort files by mtime,3919561,open,0,,,0,2023-05-14T15:25:15Z,2023-05-14T15:25:29Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2074,"Serving multiple database files and getting tired of the default sort, this changes the sort order so the most recently changed databases are at the top of the list and I don't have to scroll down - lazy as I am ;)
----
:books: Documentation preview :books:: https://datasette--2074.org.readthedocs.build/en/2074/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2074/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1708030220,I_kwDOBm6k_c5lznkM,2073,Faceting doesn't work against integer columns in views,9599,open,0,,,2,2023-05-12T18:20:10Z,2023-05-12T18:24:07Z,,OWNER,,"Spotted this issue here: https://til.simonwillison.net/datasette/baseline
I had to do this workaround:
```sql
create view baseline as select
_key,
spec,
'' || json_extract(status, '$.is_baseline') as is_baseline,
json_extract(status, '$.since') as baseline_since,
json_extract(status, '$.support.chrome') as baseline_chrome,
json_extract(status, '$.support.edge') as baseline_edge,
json_extract(status, '$.support.firefox') as baseline_firefox,
json_extract(status, '$.support.safari') as baseline_safari,
compat_features,
caniuse,
usage_stats,
status
from
[index]
```
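A quick way to reproduce the underlying SQLite behaviour outside of Datasette (a sketch using the `json_extract()` pattern from the view above):

```python
# Comparing a json_extract() view column to a string does not match the
# underlying integer value, because the expression carries no column affinity
import json
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table t (status text)')
conn.execute('insert into t values (?)', (json.dumps({'is_baseline': 1}),))
conn.execute(
    ""create view v as select json_extract(status, '$.is_baseline') as is_baseline from t""
)
print(conn.execute(""select count(*) from v where is_baseline = '1'"").fetchone())  # (0,)
print(conn.execute('select count(*) from v where is_baseline = 1').fetchone())    # (1,)
```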
I think the core issue here is that, against a table, `select * from x where integer_column = '1'` works correctly, due to some kind of column type conversion mechanism... but this mechanism doesn't work against views.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2073/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1698865182,I_kwDOBm6k_c5lQqAe,2069,[BUG] Cannot insert new data to deployed instance,31861128,open,0,,,1,2023-05-07T02:59:42Z,2023-05-07T03:17:35Z,,NONE,,"## Summary
Recently, I deployed an instance of datasette to Vercel with the following plugins:
- datasette-auth-tokens
- datasette-insert
With the above plugins, I was able to insert new data to local sqlite db. However, when it comes to the deployment on Vercel, things behave differently. I observed some errors from the logs console on Vercel:
```console
File ""/var/task/datasette/database.py"", line 179, in _execute_writes
conn = self.connect(write=True)
File ""/var/task/datasette/database.py"", line 93, in connect
assert not (write and not self.is_mutable)
AssertionError
```
I think it is a potential bug.
## Reproduce
metadata.json
```json
{
""plugins"": {
""datasette-insert"": {
""allow"": {
""id"": ""*""
}
},
""datasette-auth-tokens"": {
""tokens"": [
{
""token"": {
""$env"": ""INSERT_TOKEN""
},
""actor"": {
""id"": ""repeater""
}
}
],
""param"": ""_auth_token""
}
}
}
```
commands
```bash
# deploy
datasette publish vercel remote.db \
--project=repeater-bot-sqlite \
--metadata metadata.json \
--install datasette-auth-tokens \
--install datasette-insert \
--vercel-json=vercel.json
# test insert
cat fixtures/dogs.json | curl --request POST -d @- -H ""Authorization: Bearer "" \
'https://repeater-bot-sqlite.vercel.app/-/insert/remote/dogs?pk=id'
```
logs
```console
Traceback (most recent call last):
File ""/var/task/datasette/app.py"", line 1354, in route_path
response = await view(request, send)
File ""/var/task/datasette/app.py"", line 1500, in async_view_fn
response = await async_call_with_supported_arguments(
File ""/var/task/datasette/utils/__init__.py"", line 1005, in async_call_with_supported_arguments
return await fn(*call_with)
File ""/var/task/datasette_insert/__init__.py"", line 14, in insert_or_upsert
response = await insert_or_upsert_implementation(request, datasette)
File ""/var/task/datasette_insert/__init__.py"", line 91, in insert_or_upsert_implementation
table_count = await db.execute_write_fn(write_in_thread, block=True)
File ""/var/task/datasette/database.py"", line 167, in execute_write_fn
raise result
File ""/var/task/datasette/database.py"", line 179, in _execute_writes
conn = self.connect(write=True)
File ""/var/task/datasette/database.py"", line 93, in connect
assert not (write and not self.is_mutable)
AssertionError
```
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2069/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1690765434,I_kwDOBm6k_c5kxwh6,2067,Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6,39538958,open,0,,,1,2023-05-01T12:42:28Z,2023-05-03T00:16:03Z,,NONE,,"Hi! Wondering if this issue is limited to my local system or if it affects others as well.
It seems like 3.11 errors out on a ""litestream-restored"" database. On further investigation, it also appears to conk out on 3.10.8 but works on 3.10.7 and 3.10.6.
To demo the issue I created a test database, replicated it to an aws s3 bucket, then restored the same under various .pyenv-versioned shells where I test whether I can read the database via the sqlite3 cli.
```sh
# create new shell with 3.11.3
litestream restore -o data/db.sqlite s3://mytestbucketxx/db
sqlite3 data/db.sqlite
# SQLite version 3.41.2 2023-03-22 11:56:21
# Enter "".help"" for usage hints.
# sqlite> .tables
# _litestream_lock _litestream_seq movie
# sqlite>
```
However this gets me an `OperationalError` when reading via datasette:
Error on 3.11.3 and 3.10.8
```sh
datasette data/db.sqlite
```
```console
/tester/.venv/lib/python3.11/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API
warnings.warn(""pkg_resources is deprecated as an API"", DeprecationWarning)
Traceback (most recent call last):
File ""/tester/.venv/bin/datasette"", line 8, in
sys.exit(cli())
^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1130, in __call__
return self.main(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1055, in main
rv = self.invoke(ctx)
^^^^^^^^^^^^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1657, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 760, in invoke
return __callback(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 143, in wrapped
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 615, in serve
asyncio.get_event_loop().run_until_complete(check_databases(ds))
File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py"", line 653, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 660, in check_databases
await database.execute_fn(check_connection)
File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn
return await asyncio.get_event_loop().run_in_executor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/thread.py"", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread
return fn(conn)
^^^^^^^^
File ""/tester/.venv/lib/python3.11/site-packages/datasette/utils/__init__.py"", line 951, in check_connection
for r in conn.execute(
^^^^^^^^^^^^^
sqlite3.OperationalError: unable to open database file
```
Works on 3.10.7, 3.10.6
```sh
# create new shell with 3.10.7 / 3.10.6
litestream restore -o data/db.sqlite s3://mytestbucketxx/db
datasette data/db.sqlite
# ...
# INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
```
In both scenarios, the only dependencies were the pinned python version and the latest Datasette version 0.64.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2067/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1674322631,PR_kwDOBm6k_c5OpEz_,2061,"Add ""Packaging a plugin using Poetry"" section in docs",1238873,open,0,,,0,2023-04-19T07:23:28Z,2023-04-19T07:27:18Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2061,"This PR adds a new section about packaging a plugin using `poetry` within the ""Writing plugins"" page of the documentation.
----
:books: Documentation preview :books:: https://datasette--2061.org.readthedocs.build/en/2061/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2061/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1661860507,PR_kwDOBm6k_c5N_bMw,2056,GitHub Action to lint Python code with ruff,3709715,open,0,,,6,2023-04-11T06:41:27Z,2023-04-15T14:24:46Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2056,"[Ruff](https://beta.ruff.rs/) supports [over 500 lint rules](https://beta.ruff.rs/docs/rules) and can be used to replace [Flake8](https://pypi.org/project/flake8/) (plus dozens of plugins), [isort](https://pypi.org/project/isort/), [pydocstyle](https://pypi.org/project/pydocstyle/), [yesqa](https://github.com/asottile/yesqa), [eradicate](https://pypi.org/project/eradicate/), [pyupgrade](https://pypi.org/project/pyupgrade/), and [autoflake](https://pypi.org/project/autoflake/), all while executing (in Rust) tens or hundreds of times faster than any individual tool.
The ruff Action uses minimal steps to run in ~5 seconds, rapidly providing intuitive GitHub Annotations to contributors.
![image](https://user-images.githubusercontent.com/3709715/223758136-afc386d2-70aa-4eff-953a-2c2d82ceea23.png)
----
:books: Documentation preview :books:: https://datasette--2056.org.readthedocs.build/en/2056/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2056/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1663399821,I_kwDOBm6k_c5jJXeN,2058,"500 ""attempt to write a readonly database"" error caused by ""PRAGMA schema_version""",9599,open,0,,,9,2023-04-11T23:57:50Z,2023-04-13T16:35:21Z,,OWNER,,"I've not been able to replicate this myself yet, but I've seen log files from a user affected by it.
```
File ""/usr/local/lib/python3.11/site-packages/datasette/views/base.py"", line 89, in dispatch_request
await self.ds.refresh_schemas()
File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 371, in refresh_schemas
await self._refresh_schemas()
File ""/usr/local/lib/python3.11/site-packages/datasette/app.py"", line 386, in _refresh_schemas
schema_version = (await db.execute(""PRAGMA schema_version"")).first()[0]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 267, in execute
results = await self.execute_fn(sql_operation_in_thread)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn
return await asyncio.get_event_loop().run_in_executor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/usr/local/lib/python3.11/concurrent/futures/thread.py"", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread
return fn(conn)
^^^^^^^^
File ""/usr/local/lib/python3.11/site-packages/datasette/database.py"", line 237, in sql_operation_in_thread
cursor.execute(sql, params if params is not None else {})
sqlite3.OperationalError: attempt to write a readonly database
```
That's running the official Datasette Docker image on https://fly.io/ - it's causing 500 errors on every page of their site.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2058/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1665510265,I_kwDOBm6k_c5jRat5,2060,Clean up a bunch of warnings from ruff,9599,open,0,,,0,2023-04-13T01:23:02Z,2023-04-13T01:23:02Z,,OWNER,,"See:
- #2056
`ruff` spots a bunch of warnings about things like unused variables - would be good to clean up as many of these as possible.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2060/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1665053646,I_kwDOBm6k_c5jPrPO,2059,"""Deceptive site ahead"" alert on Heroku deployment",1186275,open,0,,,1,2023-04-12T18:34:51Z,2023-04-13T01:13:01Z,,NONE,,"I deployed a fairly basic instance of Datasette (`datasette-auth-passwords` is the only plugin) using Heroku. The deployed URL now gives a ""Deceptive site ahead"" warning to users.
Is there way around this? Maybe a way to add ownership verification [through Google's search console](https://search.google.com/search-console/welcome)? ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2059/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1657861026,I_kwDOBm6k_c5i0POi,2054,"Make detailed notes on how table, query and row views work right now",9599,open,0,,,13,2023-04-06T18:21:09Z,2023-04-07T20:14:38Z,,OWNER,,"Research to help influence the following:
- #2049
- #2053
- #2050
- #262 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2054/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1646734246,I_kwDOBm6k_c5iJyum,2049,Custom SQL queries should use new JSON ?_extra= format,9599,open,0,,8755003,4,2023-03-30T00:42:53Z,2023-04-05T23:29:27Z,,OWNER,,"Related:
- #262
I've made the change to the table view; now I need the new format to work for arbitrary SQL queries too.
Note that this incorporates both arbitrary SQL queries and canned queries.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2049/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1649791661,I_kwDOBm6k_c5iVdKt,2050,Row page JSON should use new ?_extra= format,9599,open,0,,8755003,1,2023-03-31T17:56:53Z,2023-03-31T17:59:49Z,,OWNER,,"https://latest.datasette.io/fixtures/facetable/2.json
Related:
- #2049
- #1709 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2050/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1649793525,I_kwDOBm6k_c5iVdn1,2051,`?_extra=row_urls` for table pages,9599,open,0,,,0,2023-03-31T17:58:36Z,2023-03-31T17:58:36Z,,OWNER,,Provides URLs to the JSON version of those rows. Maybe it persists the `?_shape=` option too? Not sure about that.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2051/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1615692818,I_kwDOBm6k_c5gTYQS,2035,Potential feature: special support for `?a=1&a=2` on the query page,9599,open,0,,3268330,14,2023-03-08T18:05:03Z,2023-03-31T16:09:08Z,,OWNER,,"From a discussion on Discord: https://discord.com/channels/823971286308356157/996877076982415491/1082789517062320138
The key idea is to make it easier for people to implement `where id in (...)` that's populated from query string arguments.
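For example (a rough sketch only - no such mechanism exists in Datasette today), the repeated values could be collected into a list and expanded into named parameters for that `IN` clause:
```python
import sqlite3
from urllib.parse import parse_qs

# Hypothetical sketch: collect repeated ?id= values from the query string
# and expand them into numbered parameters for a 'where id in (...)' clause.
args = parse_qs('id=11&id=32&id=62')
ids = [int(value) for value in args.get('id', [])]
placeholders = ', '.join(':id{}'.format(i) for i in range(len(ids)))
params = {'id{}'.format(i): value for i, value in enumerate(ids)}
sql = 'select * from facetable where id in ({})'.format(placeholders)

conn = sqlite3.connect(':memory:')
conn.execute('create table facetable (id integer primary key)')
conn.executemany('insert into facetable (id) values (?)', [(11,), (32,), (62,), (99,)])
print(conn.execute(sql, params).fetchall())  # [(11,), (32,), (62,)]
```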
What if you could add `?id=11&id=32&id=62` to the URL and have that made available as a list that can be used in the query?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2035/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1531991339,I_kwDOBm6k_c5bUFUr,1989,Suggestion: Hiding columns,116795,open,0,,,3,2023-01-13T09:33:32Z,2023-03-31T06:18:05Z,,NONE,,As there's the possibility of [hiding tables](https://docs.datasette.io/en/stable/metadata.html#hiding-tables) - I've run into the **need to hide specific columns** - data that's either not relevant to the public or can't be shown for privacy reasons.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1989/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1613974869,PR_kwDOBm6k_c5LgPS-,2034,remove an unused `app` var in cli.py,4370201,open,0,,,2,2023-03-07T18:19:05Z,2023-03-29T20:56:20Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2034,"This var `app` isn't actually used? Unless initializing it has some side effect outside of the event loop, I don't think it's necessary.
Feel free to ignore this PR if the deleted line actually does something.
----
:books: Documentation preview :books:: https://datasette--2034.org.readthedocs.build/en/2034/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2034/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1646068413,I_kwDOBm6k_c5iHQK9,2048,Test failures encountered while packaging for GNU Guix,8332263,open,0,,,0,2023-03-29T15:36:54Z,2023-03-29T15:36:54Z,,NONE,,"Hello,
While reviewing a package submitted to Guix to add `datasette`, the test suite produced the following errors:
```
=================================== FAILURES ===================================
_________________________ test_row_strange_table_name __________________________
[gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client =
def test_row_strange_table_name(app_client):
response = app_client.get(
""/fixtures/table~2Fwith~2Fslashes~2Ecsv/3.json?_shape=objects""
)
> assert response.status == 200
E assert 400 == 200
E + where 400 = .status
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:701: AssertionError
----------------------------- Captured stderr call -----------------------------
ERROR: conn=, sql = 'select rowid, * from [table%7E2Fwith%7E2Fslashes%7E2Ecsv] where ""rowid""=:p0', params = {'p0': '3'}: no such table: table%7E2Fwith%7E2Fslashes%7E2Ecsv
_______________ test_database_page_for_database_with_dot_in_name _______________
[gw15] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client_with_dot =
def test_database_page_for_database_with_dot_in_name(app_client_with_dot):
response = app_client_with_dot.get(""/fixtures~2Edot.json"")
> assert response.status == 200
E assert 302 == 200
E + where 302 = .status
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:633: AssertionError
___________________ test_tilde_encoded_database_names[fo%o] ____________________
[gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
db_name = 'fo%o'
@pytest.mark.asyncio
@pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d""))
async def test_tilde_encoded_database_names(db_name):
ds = Datasette()
ds.add_memory_database(db_name)
response = await ds.client.get(""/.json"")
assert db_name in response.json().keys()
path = response.json()[db_name][""path""]
# And the JSON for that database
response2 = await ds.client.get(path + "".json"")
> assert response2.status_code == 200
E assert 302 == 200
E + where 302 = .status_code
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError
__________________ test_tilde_encoded_database_names[f~/c.d] ___________________
[gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
db_name = 'f~/c.d'
@pytest.mark.asyncio
@pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d""))
async def test_tilde_encoded_database_names(db_name):
ds = Datasette()
ds.add_memory_database(db_name)
response = await ds.client.get(""/.json"")
assert db_name in response.json().keys()
path = response.json()[db_name][""path""]
# And the JSON for that database
response2 = await ds.client.get(path + "".json"")
> assert response2.status_code == 200
E assert 302 == 200
E + where 302 = .status_code
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError
______________ test_database_with_space_in_name[/searchable.json] ______________
[gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client_two_attached_databases =
path = '/searchable.json'
@pytest.mark.parametrize(
""path"",
(
""/"",
"".json"",
""/searchable"",
""/searchable.json"",
""/searchable_view"",
""/searchable_view.json"",
),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
> response = app_client_two_attached_databases.get(
""/extra~20database"" + path, follow_redirects=True
)
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
response = await self._send_handling_redirects(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
request =
follow_redirects = True
history = [, , , , , , ...]
async def _send_handling_redirects(
self,
request: Request,
follow_redirects: bool,
history: typing.List[Response],
) -> Response:
while True:
if len(history) > self.max_redirects:
> raise TooManyRedirects(
""Exceeded maximum allowed redirects."", request=request
)
E httpx.TooManyRedirects: Exceeded maximum allowed redirects.
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
___________________ test_database_with_space_in_name[.json] ____________________
[gw19] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client_two_attached_databases =
path = '.json'
@pytest.mark.parametrize(
""path"",
(
""/"",
"".json"",
""/searchable"",
""/searchable.json"",
""/searchable_view"",
""/searchable_view.json"",
),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
> response = app_client_two_attached_databases.get(
""/extra~20database"" + path, follow_redirects=True
)
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
response = await self._send_handling_redirects(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
request =
follow_redirects = True
history = [, , , , , , ...]
async def _send_handling_redirects(
self,
request: Request,
follow_redirects: bool,
history: typing.List[Response],
) -> Response:
while True:
if len(history) > self.max_redirects:
> raise TooManyRedirects(
""Exceeded maximum allowed redirects."", request=request
)
E httpx.TooManyRedirects: Exceeded maximum allowed redirects.
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
______________ test_database_with_space_in_name[/searchable_view] ______________
[gw22] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client_two_attached_databases =
path = '/searchable_view'
@pytest.mark.parametrize(
""path"",
(
""/"",
"".json"",
""/searchable"",
""/searchable.json"",
""/searchable_view"",
""/searchable_view.json"",
),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
> response = app_client_two_attached_databases.get(
""/extra~20database"" + path, follow_redirects=True
)
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
response = await self._send_handling_redirects(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
request =
follow_redirects = True
history = [, , , , , , ...]
async def _send_handling_redirects(
self,
request: Request,
follow_redirects: bool,
history: typing.List[Response],
) -> Response:
while True:
if len(history) > self.max_redirects:
> raise TooManyRedirects(
""Exceeded maximum allowed redirects."", request=request
)
E httpx.TooManyRedirects: Exceeded maximum allowed redirects.
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
_____________________ test_database_with_space_in_name[/] ______________________
[gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client_two_attached_databases =
path = '/'
@pytest.mark.parametrize(
""path"",
(
""/"",
"".json"",
""/searchable"",
""/searchable.json"",
""/searchable_view"",
""/searchable_view.json"",
),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
> response = app_client_two_attached_databases.get(
""/extra~20database"" + path, follow_redirects=True
)
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
response = await self._send_handling_redirects(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
request =
follow_redirects = True
history = [, , , , , , ...]
async def _send_handling_redirects(
self,
request: Request,
follow_redirects: bool,
history: typing.List[Response],
) -> Response:
while True:
if len(history) > self.max_redirects:
> raise TooManyRedirects(
""Exceeded maximum allowed redirects."", request=request
)
E httpx.TooManyRedirects: Exceeded maximum allowed redirects.
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
________________ test_database_with_space_in_name[/searchable] _________________
[gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client_two_attached_databases =
path = '/searchable'
@pytest.mark.parametrize(
""path"",
(
""/"",
"".json"",
""/searchable"",
""/searchable.json"",
""/searchable_view"",
""/searchable_view.json"",
),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
> response = app_client_two_attached_databases.get(
""/extra~20database"" + path, follow_redirects=True
)
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
response = await self._send_handling_redirects(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
request =
follow_redirects = True
history = [, , , , , , ...]
async def _send_handling_redirects(
self,
request: Request,
follow_redirects: bool,
history: typing.List[Response],
) -> Response:
while True:
if len(history) > self.max_redirects:
> raise TooManyRedirects(
""Exceeded maximum allowed redirects."", request=request
)
E httpx.TooManyRedirects: Exceeded maximum allowed redirects.
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
___________ test_database_with_space_in_name[/searchable_view.json] ____________
[gw23] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client_two_attached_databases =
path = '/searchable_view.json'
@pytest.mark.parametrize(
""path"",
(
""/"",
"".json"",
""/searchable"",
""/searchable.json"",
""/searchable_view"",
""/searchable_view.json"",
),
)
def test_database_with_space_in_name(app_client_two_attached_databases, path):
> response = app_client_two_attached_databases.get(
""/extra~20database"" + path, follow_redirects=True
)
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__
return call_result.result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result
return self.__get_result()
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result
raise self._exception
/gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap
result = await self.awaitable(*args, **kwargs)
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get
return await self._request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request
httpx_response = await self.ds.client.request(
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request
return await client.request(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request
return await self.send(request, auth=auth, follow_redirects=follow_redirects)
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send
response = await self._send_handling_auth(
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth
response = await self._send_handling_redirects(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
request =
follow_redirects = True
history = [, , , , , , ...]
async def _send_handling_redirects(
self,
request: Request,
follow_redirects: bool,
history: typing.List[Response],
) -> Response:
while True:
if len(history) > self.max_redirects:
> raise TooManyRedirects(
""Exceeded maximum allowed redirects."", request=request
)
E httpx.TooManyRedirects: Exceeded maximum allowed redirects.
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects
________________ test_weird_database_names[database (1).sqlite] ________________
[gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw7/test_weird_database_names_data0')
filename = 'database (1).sqlite'
@pytest.mark.parametrize(
""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""]
)
def test_weird_database_names(tmpdir, filename):
# https://github.com/simonw/datasette/issues/1181
runner = CliRunner()
db_path = str(tmpdir / filename)
sqlite3.connect(db_path).execute(""vacuum"")
result1 = runner.invoke(cli, [db_path, ""--get"", ""/""])
assert result1.exit_code == 0, result1.output
filename_no_stem = filename.rsplit(""."", 1)[0]
expected_link = '{}'.format(
tilde_encode(filename_no_stem), filename_no_stem
)
assert expected_link in result1.output
# Now try hitting that database page
result2 = runner.invoke(
cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))]
)
> assert result2.exit_code == 0, result2.output
E AssertionError:
E
E assert 1 == 0
E + where 1 = .exit_code
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError
_____________ test_weird_database_names[test-database (1).sqlite] ______________
[gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw6/test_weird_database_names_test0')
filename = 'test-database (1).sqlite'
@pytest.mark.parametrize(
""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""]
)
def test_weird_database_names(tmpdir, filename):
# https://github.com/simonw/datasette/issues/1181
runner = CliRunner()
db_path = str(tmpdir / filename)
sqlite3.connect(db_path).execute(""vacuum"")
result1 = runner.invoke(cli, [db_path, ""--get"", ""/""])
assert result1.exit_code == 0, result1.output
filename_no_stem = filename.rsplit(""."", 1)[0]
expected_link = '{}'.format(
tilde_encode(filename_no_stem), filename_no_stem
)
assert expected_link in result1.output
# Now try hitting that database page
result2 = runner.invoke(
cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))]
)
> assert result2.exit_code == 0, result2.output
E AssertionError:
E
E assert 1 == 0
E + where 1 = .exit_code
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError
_ test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] _
[gw11] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client =
path = '/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd'
expected = [['a/b | ', '.c-d | ', 'c | ']]
@pytest.mark.parametrize(
""path,expected"",
(
(
""/fixtures/compound_primary_key/a,b"",
[
[
'a | ',
'b | ',
'c | ',
]
],
),
(
""/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd"",
[
[
'a/b | ',
'.c-d | ',
'c | ',
]
],
),
),
)
def test_row_html_compound_primary_key(app_client, path, expected):
response = app_client.get(path)
> assert response.status == 200
E assert 302 == 200
E + where 302 = .status
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:370: AssertionError
_ test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] _
[gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client =
path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv'
expected_classes = ['table', 'db-fixtures', 'table-tablewithslashescsv-fa7563']
@pytest.mark.parametrize(
""path,expected_classes"",
[
(""/"", [""index""]),
(""/fixtures"", [""db"", ""db-fixtures""]),
(""/fixtures?sql=select+1"", [""query"", ""db-fixtures""]),
(
""/fixtures/simple_primary_key"",
[""table"", ""db-fixtures"", ""table-simple_primary_key""],
),
(
""/fixtures/neighborhood_search"",
[""query"", ""db-fixtures"", ""query-neighborhood_search""],
),
(
""/fixtures/table~2Fwith~2Fslashes~2Ecsv"",
[""table"", ""db-fixtures"", ""table-tablewithslashescsv-fa7563""],
),
(
""/fixtures/simple_primary_key/1"",
[""row"", ""db-fixtures"", ""table-simple_primary_key""],
),
],
)
def test_css_classes_on_body(app_client, path, expected_classes):
response = app_client.get(path)
> assert response.status == 200
E assert 302 == 200
E + where 302 = .status
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:238: AssertionError
_ test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] _
[gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client =
path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv'
expected_considered = 'table-fixtures-tablewithslashescsv-fa7563.html, *table.html'
@pytest.mark.parametrize(
""path,expected_considered"",
[
(""/"", ""*index.html""),
(""/fixtures"", ""database-fixtures.html, *database.html""),
(
""/fixtures/simple_primary_key"",
""table-fixtures-simple_primary_key.html, *table.html"",
),
(
""/fixtures/table~2Fwith~2Fslashes~2Ecsv"",
""table-fixtures-tablewithslashescsv-fa7563.html, *table.html"",
),
(
""/fixtures/simple_primary_key/1"",
""row-fixtures-simple_primary_key.html, *row.html"",
),
],
)
def test_templates_considered(app_client, path, expected_considered):
response = app_client.get(path)
> assert response.status == 200
E assert 302 == 200
E + where 302 = .status
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:264: AssertionError
_ test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] _
[gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client =
path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv'
expected = 'http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json'
@pytest.mark.parametrize(
""path,expected"",
(
# Instance index page
(""/"", ""http://localhost/.json""),
# Table page
(""/fixtures/facetable"", ""http://localhost/fixtures/facetable.json""),
(
""/fixtures/table~2Fwith~2Fslashes~2Ecsv"",
""http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json"",
),
# Row page
(
""/fixtures/no_primary_key/1"",
""http://localhost/fixtures/no_primary_key/1.json"",
),
# Database index page
(
""/fixtures"",
""http://localhost/fixtures.json"",
),
# Custom query page
(
""/fixtures?sql=select+*+from+facetable"",
""http://localhost/fixtures.json?sql=select+*+from+facetable"",
),
# Canned query page
(
""/fixtures/neighborhood_search?text=town"",
""http://localhost/fixtures/neighborhood_search.json?text=town"",
),
# /-/ pages
(
""/-/plugins"",
""http://localhost/-/plugins.json"",
),
),
)
def test_alternate_url_json(app_client, path, expected):
response = app_client.get(path)
> assert response.status == 200
E assert 302 == 200
E + where 302 = .status
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:948: AssertionError
_ test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] _
[gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client =
path = '/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC'
expected = '/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B'
@pytest.mark.parametrize(
""path,expected"",
[
(
""/fixtures/neighborhood_search"",
""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text="",
),
(
""/fixtures/neighborhood_search?text=ber"",
""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text=ber"",
),
(""/fixtures/pragma_cache_size"", None),
(
# /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬
""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC"",
""/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B"",
),
(""/fixtures/magic_parameters"", None),
],
)
def test_edit_sql_link_on_canned_queries(app_client, path, expected):
response = app_client.get(path)
> assert response.status == 200
E assert 302 == 200
E + where 302 = .status
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:841: AssertionError
_______________________ test_table_with_slashes_in_name ________________________
[gw9] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client =
def test_table_with_slashes_in_name(app_client):
response = app_client.get(
""/fixtures/table~2Fwith~2Fslashes~2Ecsv.json?_shape=objects""
)
> assert response.status == 200
E assert 302 == 200
E + where 302 = .status
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:141: AssertionError
__________________ test_custom_query_with_unicode_characters ___________________
[gw8] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client =
def test_custom_query_with_unicode_characters(app_client):
# /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬.json
response = app_client.get(
""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC.json?_shape=array""
)
> assert [{""id"": 1, ""name"": ""San Francisco""}] == response.json
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:1042:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:40: in json
return json.loads(self.text)
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/__init__.py:346: in loads
return _default_decoder.decode(s)
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:337: in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , s = '', idx = 0
def raw_decode(self, s, idx=0):
""""""Decode a JSON document from ``s`` (a ``str`` beginning with
a JSON document) and return a 2-tuple of the Python
representation and the index in ``s`` where the document ended.
This can be used to decode a JSON document from a string that may
have extraneous data at the end.
""""""
try:
obj, end = self.scan_once(s, idx)
except StopIteration as err:
> raise JSONDecodeError(""Expecting value"", s, err.value) from None
E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:355: JSONDecodeError
_ test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] _
[gw13] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
app_client =
path = '/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw'
expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]
@pytest.mark.parametrize(
""path,expected_rows"",
[
(
""/fixtures/searchable.json?_search=dog"",
[
[1, ""barry cat"", ""terry dog"", ""panther""],
[2, ""terry dog"", ""sara weasel"", ""puma""],
],
),
(
# Special keyword shouldn't break FTS query
""/fixtures/searchable.json?_search=AND"",
[],
),
(
# Without _searchmode=raw this should return no results
""/fixtures/searchable.json?_search=te*+AND+do*"",
[],
),
(
# _searchmode=raw
""/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw"",
[
[1, ""barry cat"", ""terry dog"", ""panther""],
[2, ""terry dog"", ""sara weasel"", ""puma""],
],
),
(
# _searchmode=raw combined with _search_COLUMN
""/fixtures/searchable.json?_search_text2=te*&_searchmode=raw"",
[
[1, ""barry cat"", ""terry dog"", ""panther""],
],
),
(
""/fixtures/searchable.json?_search=weasel"",
[[2, ""terry dog"", ""sara weasel"", ""puma""]],
),
(
""/fixtures/searchable.json?_search_text2=dog"",
[[1, ""barry cat"", ""terry dog"", ""panther""]],
),
(
""/fixtures/searchable.json?_search_name%20with%20.%20and%20spaces=panther"",
[[1, ""barry cat"", ""terry dog"", ""panther""]],
),
],
)
def test_searchable(app_client, path, expected_rows):
response = app_client.get(path)
> assert expected_rows == response.json[""rows""]
E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == []
E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E Full diff:
E [
E - ,
E + [1,
E + 'barry cat',
E + 'terry dog',
E + 'panther'],
E + [2,
E + 'terry dog',
E + 'sara weasel',
E + 'puma'],
E ]
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:402: AssertionError
_____ test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] ______
[gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
table_metadata = {'searchmode': 'raw'}, querystring = '_search=te*+AND+do*'
expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]
@pytest.mark.parametrize(
""table_metadata,querystring,expected_rows"",
[
(
{},
""_search=te*+AND+do*"",
[],
),
(
{""searchmode"": ""raw""},
""_search=te*+AND+do*"",
_SEARCHMODE_RAW_RESULTS,
),
(
{},
""_search=te*+AND+do*&_searchmode=raw"",
_SEARCHMODE_RAW_RESULTS,
),
# Can be over-ridden with _searchmode=escaped
(
{""searchmode"": ""raw""},
""_search=te*+AND+do*&_searchmode=escaped"",
[],
),
],
)
def test_searchmode(table_metadata, querystring, expected_rows):
with make_app_client(
metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}}
) as client:
response = client.get(""/fixtures/searchable.json?"" + querystring)
> assert expected_rows == response.json[""rows""]
E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == []
E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E Full diff:
E [
E - ,
E + [1,
E + 'barry cat',
E + 'terry dog',
E + 'panther'],
E + [2,
E + 'terry dog',
E + 'sara weasel',
E + 'puma'],
E ]
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError
_ test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] _
[gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python
table_metadata = {}, querystring = '_search=te*+AND+do*&_searchmode=raw'
expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]
@pytest.mark.parametrize(
""table_metadata,querystring,expected_rows"",
[
(
{},
""_search=te*+AND+do*"",
[],
),
(
{""searchmode"": ""raw""},
""_search=te*+AND+do*"",
_SEARCHMODE_RAW_RESULTS,
),
(
{},
""_search=te*+AND+do*&_searchmode=raw"",
_SEARCHMODE_RAW_RESULTS,
),
# Can be over-ridden with _searchmode=escaped
(
{""searchmode"": ""raw""},
""_search=te*+AND+do*&_searchmode=escaped"",
[],
),
],
)
def test_searchmode(table_metadata, querystring, expected_rows):
with make_app_client(
metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}}
) as client:
response = client.get(""/fixtures/searchable.json?"" + querystring)
> assert expected_rows == response.json[""rows""]
E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == []
E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E Full diff:
E [
E - ,
E + [1,
E + 'barry cat',
E + 'terry dog',
E + 'panther'],
E + [2,
E + 'terry dog',
E + 'sara weasel',
E + 'puma'],
E ]
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError
=========================== short test summary info ============================
FAILED tests/test_api.py::test_row_strange_table_name - assert 400 == 200
FAILED tests/test_api.py::test_database_page_for_database_with_dot_in_name - ...
FAILED tests/test_api.py::test_tilde_encoded_database_names[fo%o] - assert 30...
FAILED tests/test_api.py::test_tilde_encoded_database_names[f~/c.d] - assert ...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable.json]
FAILED tests/test_api.py::test_database_with_space_in_name[.json] - httpx.Too...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view]
FAILED tests/test_api.py::test_database_with_space_in_name[/] - httpx.TooMany...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable] - htt...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view.json]
FAILED tests/test_cli.py::test_weird_database_names[database (1).sqlite] - As...
FAILED tests/test_cli.py::test_weird_database_names[test-database (1).sqlite]
FAILED tests/test_html.py::test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1]
FAILED tests/test_html.py::test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5]
FAILED tests/test_html.py::test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html]
FAILED tests/test_html.py::test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json]
FAILED tests/test_html.py::test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B]
FAILED tests/test_table_api.py::test_table_with_slashes_in_name - assert 302 ...
FAILED tests/test_table_api.py::test_custom_query_with_unicode_characters - j...
FAILED tests/test_table_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]
FAILED tests/test_table_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1]
FAILED tests/test_table_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2]
=========== 22 failed, 1049 passed, 3 skipped in 1522.28s (0:25:22) ============
error: in phase 'check': uncaught exception:
%exception #<&invoke-error program: ""/gnu/store/ziqwkzz6znb5d3c245xn0cq5ra2ly0w3-python-pytest-7.1.3/bin/pytest"" arguments: (""-vv"" ""-n"" ""24"" ""-m"" ""not serial"") exit-status: 1 term-signal: #f stop-signal: #f>
phase `check' failed after 1523.3 seconds
```
The tests run in a private namespace without internet connectivity, and the Python dependencies are at:
```
python-aiofiles@0.6.0 python-asgi-csrf@0.9 python-asgiref@3.4.1
+ python-beautifulsoup4@4.11.1 python-black@22.3.0 python-click-default-group@1.2.2 python-click@8.1.3
+ python-cogapp@3.3.0 python-httpx@0.23.0 python-hupper@1.10.3 python-itsdangerous@2.0.1
+ python-janus@1.0.0 python-jinja2@3.1.1 python-mergedeep@1.3.4 python-pint@0.20.1 python-pluggy@1.0.0
+ python-pytest-asyncio@0.17.2 python-pytest-runner@5.2 python-pytest-timeout@2.0.2
+ python-pytest-xdist@2.5.0 python-pytest@7.1.3 python-pyyaml@6.0 python-setuptools@64.0.3
+ python-trustme@0.9.0 python-uvicorn@0.17.6
```
With Python 3.9.9.
Thank you!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2048/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
323658641,MDU6SXNzdWUzMjM2NTg2NDE=,262,Add ?_extra= mechanism for requesting extra properties in JSON,9599,open,0,,3268330,27,2018-05-16T14:55:42Z,2023-03-29T06:22:22Z,,OWNER,,"Datasette views currently work by creating a set of data that should be returned as JSON, then defining an additional, optional `template_data()` function which is called if the view is being rendered as HTML.
This `template_data()` function calculates extra template context variables which are necessary for the HTML view but should not be included in the JSON.
Example of how that is used today: https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/table.py#L672-L704
With features like Facets in #255 I'm beginning to want to move more items into the `template_data()` - in the case of facets it's the `suggested_facets` array. This saves that feature from being calculated (involving several SQL queries) for the JSON case where it is unlikely to be used.
But... as an API user, I want to still optionally be able to access that information.
Solution: Add a `?_extra=suggested_facets&_extra=table_metadata` argument which can be used to optionally request additional blocks to be added to the JSON API.
Then redefine as many of the current `template_data()` features as extra arguments instead, and teach Datasette to return certain extras by default when rendering templates.
This could allow the JSON representation to be slimmed down further (removing e.g. the `table_definition` and `view_definition` keys) while still making that information available to API users who need it.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/262/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1641013220,I_kwDOBm6k_c5hz9_k,2045,First column on a view page has no facet option in cog menu,9599,open,0,,3268330,0,2023-03-26T18:02:47Z,2023-03-26T18:02:48Z,,OWNER,,"e.g. first column on this page - cog menu has no option to facet.
https://datasette.io/content/tools
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2045/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1639873822,PR_kwDOBm6k_c5M29tt,2044,Expand labels in row view as well (patch for 0.64.x branch),82332573,open,0,,,0,2023-03-24T18:44:44Z,2023-03-24T18:44:57Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2044,"This is a version of #2031 for the 0.64.x branch.
----
:books: Documentation preview :books:: https://datasette--2044.org.readthedocs.build/en/2044/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2044/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1605481359,PR_kwDOBm6k_c5LDwrF,2031,Expand foreign key references in row view as well,82332573,open,0,,,5,2023-03-01T18:43:09Z,2023-03-24T18:35:25Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/2031,"Unlike the table view, the single row view does not resolve foreign key references into labels. This patch extracts the foreign key reference expansion code from TableView.data() into a standalone function that is then called by both TableView.data() and RowView.data().
----
:books: Documentation preview :books:: https://datasette--2031.org.readthedocs.build/en/2031/
",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2031/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
1121583414,I_kwDOBm6k_c5C2gE2,1619,JSON link on row page is 404 if base_url setting is used,9599,open,0,,,5,2022-02-02T07:09:53Z,2023-03-24T15:38:04Z,,OWNER,,"On my local environment:
`datasette fixtures.db -p 3344 --setting base_url /foo/bar/`
Then hit http://127.0.0.1:3344/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3
But... that `json` link goes here, which is a 404:
http://127.0.0.1:3344/foo/bar/foo/bar/fixtures/table%2Fwith%2Fslashes.csv/3?_format=json",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1619/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1636616315,I_kwDOBm6k_c5hjMh7,2042,Gather feedback on new ?_extra= design,9599,open,0,,,0,2023-03-22T23:07:43Z,2023-03-22T23:08:19Z,,OWNER,,"Now that I've landed:
- #1999
See also:
- #262
I want to get some feedback from people on the design of the new `?_extra=` feature, before freezing it into Datasette 1.0.
The big change is that the default JSON representation is now MUCH slimmer - it only gives you keys for `""next""` and `""rows""`, where rows is a list of JSON objects (not a list of arrays as was previously the default) - for example https://latest.datasette.io/fixtures/sortable.json
If you want extra stuff you can ask for it with the new `?_extra=` parameter - e.g. https://latest.datasette.io/fixtures/sortable.json?_extra=columns&_extra=suggested_facets
You can use `?_extra=extras` to see a list of available extras: https://latest.datasette.io/fixtures/sortable.json?_extra=extras
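A quick way to poke at the new shape from Python, reusing the example URLs above (httpx here is purely for illustration):
```python
import httpx

# The default representation is slim; extras are opt-in via ?_extra=
base = 'https://latest.datasette.io/fixtures/sortable.json'
slim = httpx.get(base).json()
print(sorted(slim.keys()))  # just 'next' and 'rows', per the description above

extras = httpx.get(base, params={'_extra': ['columns', 'suggested_facets']}).json()
print(sorted(extras.keys()))
```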
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2042/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1620515757,I_kwDOBm6k_c5glxut,2039,Subtle bug with `--load-extension` and `--static` flags with absolute Windows paths with `C:\`,15178711,open,0,,,0,2023-03-12T21:18:52Z,2023-03-12T21:18:52Z,,CONTRIBUTOR,,"From the Datasette Discord: a user tried running the following command on Windows:
```
datasette --load-extension=""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""
```
This failed with `""The specified module could not be found""`, because the entrypoint option introduced in #1789 splits the input differently. Instead of loading the extension found at `""C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""`, it tried to load the extension at `""C""` with entrypoint `""\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll""`.
This is hard because most absolute Windows paths have a colon in them, like `C:\foo.txt` or `D:\bar.txt`. I'd imagine the `--static` flag is also vulnerable to this type of bug.
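A rough illustration of the parsing pitfall (not Datasette's actual code):
```python
# Naively splitting on the first ':' to separate the path from the
# entrypoint mangles absolute Windows paths, which start with a drive letter.
value = r'C:\spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll'
path, _, entrypoint = value.partition(':')
print(path)        # C
print(entrypoint)  # \spatialite\mod_spatialite-5.0.1-win-x86\mod_spatialite.dll
```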
The ""solution"" is to use a relative path instead, but that doesn't feel that great. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2039/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
317001500,MDU6SXNzdWUzMTcwMDE1MDA=,236,datasette publish lambda plugin,9599,open,0,,,11,2018-04-23T22:10:30Z,2023-03-12T14:04:15Z,,OWNER,,"Refs #217 - create a publish plugin that can deploy to AWS Lambda.
https://docs.aws.amazon.com/lambda/latest/dg/limits.html says lambda packages can be up to 50 MB, so this would only work with smaller databases (the command can check the filesize before attempting to package and deploy it).
Lambdas do get a 512 MB `/tmp` directory too, so for larger databases the function could start and then download up to 512 MB from an S3 bucket - so the plugin could take an optional S3 bucket to write to, know how to upload the `.db` file there, and have the Lambda download it on startup.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/236/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1618249044,I_kwDOBm6k_c5gdIVU,2038,Consider a `strict_templates` setting,9599,open,0,,,2,2023-03-10T02:09:13Z,2023-03-10T02:11:06Z,,OWNER,,"A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error.
Prototype here:
```diff
diff --git a/datasette/app.py b/datasette/app.py
index 40416713..1428a3f0 100644
--- a/datasette/app.py
+++ b/datasette/app.py
@@ -200,6 +200,7 @@ SETTINGS = (
""Allow display of SQL trace debug information with ?_trace=1"",
),
Setting(""base_url"", ""/"", ""Datasette URLs should use this base path""),
+ Setting(""strict_templates"", False, ""Raise errors for undefined template variables""),
)
_HASH_URLS_REMOVED = ""The hash_urls setting has been removed, try the datasette-hashed-urls plugin instead""
OBSOLETE_SETTINGS = {
@@ -399,11 +400,14 @@ class Datasette:
),
]
)
+ env_extras = {}
+ if self.setting(""strict_templates""):
+ env_extras[""undefined""] = StrictUndefined
self.jinja_env = Environment(
loader=template_loader,
autoescape=True,
enable_async=True,
- undefined=StrictUndefined,
+ **env_extras,
)
self.jinja_env.filters[""escape_css_string""] = escape_css_string
self.jinja_env.filters[""quote_plus""] = urllib.parse.quote_plus
```
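Independent of the patch above, this is the Jinja behaviour the setting would toggle - a small sketch:
```python
from jinja2 import Environment, StrictUndefined

# With the default Undefined a missing variable renders as empty text;
# with StrictUndefined it raises UndefinedError at render time.
lenient = Environment()
strict = Environment(undefined=StrictUndefined)

print(lenient.from_string('Hello {{ missing }}!').render())  # 'Hello !'
strict.from_string('Hello {{ missing }}!').render()          # raises UndefinedError
```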
Explored this idea a bit in:
- #1999",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2038/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1590183272,I_kwDOBm6k_c5eyEVo,2027,"How to redirect from ""/"" to a specific db/table",1350673,open,0,,,4,2023-02-18T03:14:01Z,2023-03-08T04:42:22Z,,NONE,,"I'm using nginx to redirect the public IP to the local uvicorn server as 'normal'. I can't figure out how to redirect so that '/' serves the one db/table I want; redirecting / to /db/table breaks some of the CSS, and fooling with base_url doesn't seem to help. Can someone explain how to do this, if it's possible?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2027/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
1605959201,I_kwDOBm6k_c5fuP4h,2032,datasette errors when foreign key integrity is enabled,193185,open,0,,,0,2023-03-02T01:27:51Z,2023-03-02T01:31:58Z,,CONTRIBUTOR,,"By default, [SQLite does not enforce foreign key constraints](https://www.sqlite.org/foreignkeys.html#fk_enable). I typically enable these checks by running:
```sql
PRAGMA foreign_keys = ON;
```
inside of a `prepare_connection` hook.
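For reference, a minimal sketch of such a hook (illustrative, not lifted from a particular plugin):
```python
from datasette import hookimpl

# Turn on foreign key enforcement for every connection Datasette opens.
@hookimpl
def prepare_connection(conn):
    conn.execute('PRAGMA foreign_keys = ON')
```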
If a plugin causes the schema to change (e.g. datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with:
```
FOREIGN KEY constraint failed
```
This could be resolved by either:
- deleting from the `tables` column last
- changing the schema so that the foreign keys have [ON DELETE CASCADE](https://www.sqlite.org/foreignkeys.html#fk_actions)
Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can work around this by inspecting the database parameter in `prepare_connection` and trying not to enable fkey checks on the `_internal` database.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2032/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,