
issues


9 rows where type = "pull" and user = 15178711 sorted by updated_at descending


This data as json, CSV (advanced)

Suggested facets: comments, draft, created_at (date), updated_at (date), closed_at (date)

state 2

  • closed 7
  • open 2

repo 2

  • datasette 8
  • sqlite-utils 1

type 1

  • pull · 9
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app reactions draft state_reason
1901483874 PR_kwDOBm6k_c5amULw 2190 Raise an exception if a "plugins" block exists in metadata.json asg017 15178711 closed 0     5 2023-09-18T18:08:56Z 2023-10-12T16:20:51Z 2023-10-12T16:20:51Z CONTRIBUTOR simonw/datasette/pulls/2190

refs #2183 #2093

From this comment in #2183: If a "plugins" block appears in metadata.json, it means that a user hasn't migrated over their plugin configuration from metadata.json to datasette.yaml, which is a breaking change in Datasette 1.0.

This PR will ensure that an error is raised whenever that happens.
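The check described in the PR can be sketched in plain Python. This is a hypothetical illustration, not Datasette's actual implementation; the function name and error message are invented:

```python
# Hypothetical sketch of the validation described above; the function
# name and message are illustrative, not Datasette's actual code.
def check_metadata(metadata: dict) -> None:
    if "plugins" in metadata:
        raise ValueError(
            "'plugins' is no longer supported in metadata; "
            "move plugin configuration to datasette.yaml"
        )

check_metadata({"title": "My instance"})  # passes silently
try:
    check_metadata({"plugins": {"datasette-my-plugin": {}}})
except ValueError as e:
    print(e)
```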


:books: Documentation preview :books:: https://datasette--2190.org.readthedocs.build/en/2190/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2190/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1901768721 PR_kwDOBm6k_c5anSg5 2191 Move `permissions`, `allow` blocks, canned queries and more out of `metadata.yaml` and into `datasette.yaml` asg017 15178711 closed 0     4 2023-09-18T21:21:16Z 2023-10-12T16:16:38Z 2023-10-12T16:16:38Z CONTRIBUTOR simonw/datasette/pulls/2191

The PR moves the following fields from metadata.yaml to datasette.yaml:

  • permissions
  • allow
  • allow_sql
  • queries
  • extra_css_urls
  • extra_js_urls

This is a significant breaking change that users will need to upgrade their metadata.yaml files for. But the format/locations are similar to the previous version, so it shouldn't be too difficult to upgrade.

One note: I'm still working on the Configuration docs, specifically the "reference" section. Though it's pretty small, the rest is ready to review.

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2191/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1891212159 PR_kwDOBm6k_c5aD33C 2183 `datasette.yaml` plugin support asg017 15178711 closed 0     4 2023-09-11T20:26:04Z 2023-09-13T21:06:25Z 2023-09-13T21:06:25Z CONTRIBUTOR simonw/datasette/pulls/2183

Part of #2093

In #2149, we ported "settings.json" over into the new datasette.yaml config file, under a top-level "settings" key. This PR ports plugin configuration over into a top-level "plugins" key, as well as nested database/table plugin config.

From now on, no plugin-related configuration is allowed in metadata.yaml, and must be in datasette.yaml in this new format. This is a pretty significant breaking change. Thankfully, you should be able to copy-paste your legacy plugin key/values into the new datasette.yaml format.

An example of what datasette.yaml would look like with this new plugin config:

```yaml
plugins:
  datasette-my-plugin:
    config_key: value

databases:
  fixtures:
    plugins:
      datasette-my-plugin:
        config_key: fixtures-db-value
    tables:
      students:
        plugins:
          datasette-my-plugin:
            config_key: fixtures-students-table-value
```

As an additional benefit, this now works with the new -s flag:

```bash
datasette --memory -s 'plugins.datasette-my-plugin.config_key' new_value
```

Marked as a "Draft" right now until I add better documentation. We also should have a plan for the next alpha release to document and publicize this change, especially for plugin authors (since their docs will have to change to say datasette.yaml instead of metadata.yaml).


:books: Documentation preview :books:: https://datasette--2183.org.readthedocs.build/en/2183/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2183/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1884330740 PR_kwDOBm6k_c5ZszDF 2174 Use $DATASETTE_INTERNAL in absence of --internal asg017 15178711 open 0     3 2023-09-06T16:07:15Z 2023-09-08T00:46:13Z   CONTRIBUTOR simonw/datasette/pulls/2174

refs 2157, specifically this comment

Passing in --internal my_internal.db over and over again can get repetitive.

This PR adds a new configurable environment variable, DATASETTE_INTERNAL_DB_PATH. If it's defined, it is used as the path to the internal database. Users can still override this behavior by passing in their own --internal internal.db flag.
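A minimal sketch of that fallback logic, using the env variable name from this PR body (the helper name is hypothetical):

```python
import os

# Hypothetical sketch of the fallback described above: the --internal
# CLI value wins, otherwise fall back to the environment variable.
def resolve_internal_path(cli_value=None):
    return cli_value or os.environ.get("DATASETTE_INTERNAL_DB_PATH")

os.environ["DATASETTE_INTERNAL_DB_PATH"] = "/tmp/internal.db"
print(resolve_internal_path())         # env var used in absence of a flag
print(resolve_internal_path("my.db"))  # explicit flag still wins
```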

In draft mode for now, needs tests and documentation.

Side note: Maybe we could have a section in the docs that lists all the "configuration environment variables" that Datasette respects? I did a quick grep and found:

  • DATASETTE_LOAD_PLUGINS
  • DATASETTE_SECRETS

:books: Documentation preview :books:: https://datasette--2174.org.readthedocs.build/en/2174/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2174/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1870672704 PR_kwDOBm6k_c5Y-7Em 2162 Add new `--internal internal.db` option, deprecate legacy `_internal` database asg017 15178711 closed 0     4 2023-08-29T00:05:07Z 2023-08-29T03:24:23Z 2023-08-29T03:24:23Z CONTRIBUTOR simonw/datasette/pulls/2162

refs #2157

This PR adds a new --internal option to datasette serve. If provided, it is the path to a persistent internal database that Datasette core and Datasette plugins can use to store data, as discussed in the proposal issue.

This PR also removes and deprecates the previous in-memory _internal database. Those tables now appear in the internal database with core_ prefixes (e.g. the tables table in _internal is now core_tables in the internal database).

A note on the new core_ tables

One important note about those new core_ tables: if a --internal DB is passed in, those core_ tables will persist across multiple Datasette instances. This wasn't the case before, since _internal was always an in-memory database created from scratch.

I tried to make those core_ tables TEMP tables - after all, there's only one internal DB connection at a time, so I figured it would work. But, since we use the Database() wrapper for the internal DB, it has two separate connections: a default read-only connection and a write connection that is created when a write operation occurs. That meant the TEMP tables would be created by the write connection, but not available in the read-only connection.

So I had a brilliant idea: attach an in-memory named database with cache=shared, and create those tables there!

```sql
ATTACH DATABASE 'file:datasette_internal_core?mode=memory&cache=shared' AS core;
```

We'd run this on both the read-only connection and the write connection. That way, those tables would stay in memory, they'd be shared between the two connections thanks to cache=shared, and we'd be good to go.
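Outside of Datasette's Database() wrapper, the idea itself does work; a stand-alone sketch with Python's sqlite3 module (not Datasette code - the database name comes from the SQL above, everything else is illustrative):

```python
import sqlite3

# Two independent connections ATTACH the same named in-memory database
# with cache=shared, so a table created via one is visible on the other.
uri = "file:datasette_internal_core?mode=memory&cache=shared"

# uri=True on the main database enables URI parsing for ATTACH as well.
writer = sqlite3.connect("file::memory:", uri=True)
writer.execute(f"ATTACH DATABASE '{uri}' AS core")
writer.execute("CREATE TABLE core.core_tables (name TEXT)")
writer.execute("INSERT INTO core.core_tables VALUES ('issues')")
writer.commit()

reader = sqlite3.connect("file::memory:", uri=True)
reader.execute(f"ATTACH DATABASE '{uri}' AS core")
rows = reader.execute("SELECT name FROM core.core_tables").fetchall()
print(rows)
```

The attached database lives as long as at least one connection holds it open.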

However, I couldn't find an easy way to run an ATTACH DATABASE command on the read-only connection.

Using Database() as a wrapper for the internal DB is pretty limiting - it's meant for Datasette "data" databases, where we want multiple readers and possibly one write connection at a time. But the internal database doesn't really require that kind of support - I think we could get away with a single read/write connection, but it seemed like too big of a rabbit hole to go down now.


:books: Documentation preview :books:: https://datasette--2162.org.readthedocs.build/en/2162/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2162/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1864112887 PR_kwDOBm6k_c5Yo7bk 2151 Test Datasette on multiple SQLite versions asg017 15178711 open 0     1 2023-08-23T22:42:51Z 2023-08-23T22:58:13Z   CONTRIBUTOR simonw/datasette/pulls/2151

still testing, hope it works!


:books: Documentation preview :books:: https://datasette--2151.org.readthedocs.build/en/2151/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2151/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
1  
1861812208 PR_kwDOBm6k_c5YhH-W 2149 Start a new `datasette.yaml` configuration file, with settings support asg017 15178711 closed 0     2 2023-08-22T16:24:16Z 2023-08-23T01:26:11Z 2023-08-23T01:26:11Z CONTRIBUTOR simonw/datasette/pulls/2149

refs #2093 #2143

This is the first step to implementing the new datasette.yaml/datasette.json configuration file.

  • The old --config argument is now back, and is the path to a datasette.yaml file. Acts like the --metadata flag.
  • The old settings.json behavior has been removed.
  • The "settings" key inside datasette.yaml defines the same --settings flags
  • Values passed in --settings will overwrite values in datasette.yaml
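The precedence described in the bullets above amounts to a simple merge, with CLI flags winning; a hypothetical sketch (names and setting keys are illustrative):

```python
# Hypothetical sketch of the precedence described above: the "settings"
# block from datasette.yaml provides defaults, and --settings flags win.
def resolve_settings(config_file_settings, cli_settings):
    merged = dict(config_file_settings)  # start from datasette.yaml
    merged.update(cli_settings)          # --settings flags overwrite
    return merged

print(resolve_settings(
    {"sql_time_limit_ms": 1000, "max_returned_rows": 100},
    {"sql_time_limit_ms": 3000},
))
```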

Docs for the config file are pretty light; there's not much to add until we add more config to the file.


:books: Documentation preview :books:: https://datasette--2149.org.readthedocs.build/en/2149/

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/2149/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1816917522 PR_kwDOCGYnMM5WJ6Jm 573 feat: Implement a prepare_connection plugin hook asg017 15178711 closed 0     4 2023-07-22T22:48:44Z 2023-07-22T22:59:09Z 2023-07-22T22:59:09Z CONTRIBUTOR simonw/sqlite-utils/pulls/573

Just like the Datasette prepare_connection hook, this PR adds a similar hook for the sqlite-utils plugin system.

The sole argument is conn, since I don't believe a database or datasette argument would be relevant here.

I want to do this so I can release sqlite-utils plugins for my SQLite extensions, similar to the Datasette plugins I've released for them.

An example plugin: https://gist.github.com/asg017/d7cdf0d56e2be87efda28cebee27fa3c

```bash
$ sqlite-utils install https://gist.github.com/asg017/d7cdf0d56e2be87efda28cebee27fa3c/archive/5f5ad549a40860787629c69ca120a08c32519e99.zip

$ sqlite-utils memory 'select hello("alex") as response'
[{"response": "Hello, alex!"}]
```

Refs:
  • #574
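The hook's job, as described, is just to receive a connection and register things on it. A stand-alone sketch of such a hook body using plain sqlite3 (the hello() function is hypothetical, mirroring the gist linked above; the real plugin system wires this up via pluggy):

```python
import sqlite3

# A minimal sketch of what a prepare_connection hook body can do:
# register a custom SQL function on the connection it receives.
def prepare_connection(conn):
    conn.create_function("hello", 1, lambda name: f"Hello, {name}!")

conn = sqlite3.connect(":memory:")
prepare_connection(conn)
print(conn.execute("select hello('alex') as response").fetchone()[0])
```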


:books: Documentation preview :books:: https://sqlite-utils--573.org.readthedocs.build/en/573/

sqlite-utils 140912432 pull    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/573/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  
1344823170 PR_kwDOBm6k_c49e3_k 1789 Add new entrypoint option to `--load-extension` asg017 15178711 closed 0     9 2022-08-19T19:27:47Z 2022-08-23T18:42:52Z 2022-08-23T18:34:30Z CONTRIBUTOR simonw/datasette/pulls/1789

Closes #1784

The --load-extension flag can now accept an optional "entrypoint" value, to specify which entrypoint SQLite should load from the given extension.

```bash
# would load default entrypoint like before
datasette data.db --load-extension ext

# loads the extension with the "sqlite3_foo_init" entrypoint
datasette data.db --load-extension ext:sqlite3_foo_init

# loads the extension with the "sqlite3_bar_init" entrypoint
datasette data.db --load-extension ext:sqlite3_bar_init
```
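Splitting the optional entrypoint off the flag value could look something like this hypothetical sketch (the real implementation may differ, e.g. in how it handles paths containing colons):

```python
# Hypothetical sketch of parsing a "path:entrypoint" --load-extension
# value; returns (path, entrypoint) with entrypoint=None for the default.
def parse_load_extension(value):
    if ":" in value:
        path, entrypoint = value.rsplit(":", 1)
        return path, entrypoint
    return value, None

print(parse_load_extension("ext"))
print(parse_load_extension("ext:sqlite3_foo_init"))
```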

For testing, I added a small SQLite extension in C at tests/ext.c. If compiled, pytest will run the unit tests in test_load_extensions.py to verify that Datasette loads extensions correctly (and loads the correct entrypoints). Compiling the extension requires a C compiler; I compiled it on my Mac with:

```bash
gcc ext.c -I path/to/sqlite -fPIC -shared -o ext.dylib
```

Where path/to/sqlite is a directory that contains the SQLite amalgamation header files.

Re documentation: I added a bit to the help text for --load-extension (which I believe should auto-add to documentation?), and the existing extension documentation is SpatiaLite-specific. Let me know if a new extensions documentation page would be helpful!

datasette 107914493 pull    
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1789/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
0  


CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT, [reactions] TEXT, [draft] INTEGER, [state_reason] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);
Powered by Datasette · Queries took 51.705ms · About: github-to-sqlite