issue_comments
23 rows where issue = 775666296 ("datasette insert" command and plugin hook) and user = 9599 (simonw), sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
753568428 | https://github.com/simonw/datasette/issues/1160#issuecomment-753568428 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MzU2ODQyOA== | simonw 9599 | 2021-01-03T05:02:32Z | 2021-01-03T05:02:32Z | OWNER | Should this command include a I thought about doing that for But maybe I can set sensible defaults for that with |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752275611 | https://github.com/simonw/datasette/issues/1160#issuecomment-752275611 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI3NTYxMQ== | simonw 9599 | 2020-12-29T23:32:04Z | 2020-12-29T23:32:04Z | OWNER | If I can get this working for CSV, TSV, JSON and JSON-NL that should be enough to exercise the API design pretty well across both streaming and non-streaming formats. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752274509 | https://github.com/simonw/datasette/issues/1160#issuecomment-752274509 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI3NDUwOQ== | simonw 9599 | 2020-12-29T23:26:02Z | 2020-12-29T23:26:02Z | OWNER | The documentation for this plugin hook is going to be pretty detailed, since it involves writing custom classes. I'll stick it all on the existing hooks page for the moment, but I should think about breaking up the plugin hook documentation into a page-per-hook in the future. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752274078 | https://github.com/simonw/datasette/issues/1160#issuecomment-752274078 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI3NDA3OA== | simonw 9599 | 2020-12-29T23:23:39Z | 2020-12-29T23:23:39Z | OWNER | If I design this right I can ship a full version of the command-line |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752273873 | https://github.com/simonw/datasette/issues/1160#issuecomment-752273873 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI3Mzg3Mw== | simonw 9599 | 2020-12-29T23:22:30Z | 2020-12-29T23:22:30Z | OWNER | How much of this should I get done in a branch before merging into `main`? The challenge here is the plugin hook design: ideally I don't want an incomplete plugin hook design in `main`. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752273400 | https://github.com/simonw/datasette/issues/1160#issuecomment-752273400 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI3MzQwMA== | simonw 9599 | 2020-12-29T23:19:46Z | 2020-12-29T23:19:46Z | OWNER | I'm going to break out some separate tickets. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752273306 | https://github.com/simonw/datasette/issues/1160#issuecomment-752273306 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI3MzMwNg== | simonw 9599 | 2020-12-29T23:19:15Z | 2020-12-29T23:19:15Z | OWNER | It would be nice if this abstraction could support progress bars as well. These won't necessarily work for every format - or they might work for things loaded from files but not things loaded over URLs (if the response doesn't include a content-length header, for example). |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752267905 | https://github.com/simonw/datasette/issues/1160#issuecomment-752267905 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI2NzkwNQ== | simonw 9599 | 2020-12-29T22:52:09Z | 2020-12-29T22:52:09Z | OWNER | What's the simplest thing that could possibly work? I think it's |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752266076 | https://github.com/simonw/datasette/issues/1160#issuecomment-752266076 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI2NjA3Ng== | simonw 9599 | 2020-12-29T22:42:23Z | 2020-12-29T22:42:59Z | OWNER | Aside: maybe This would be useful for import mechanisms that are likely to need their own custom set of command-line options unique to that source. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752265600 | https://github.com/simonw/datasette/issues/1160#issuecomment-752265600 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI2NTYwMA== | simonw 9599 | 2020-12-29T22:39:56Z | 2020-12-29T22:39:56Z | OWNER | Does it definitely make sense to break this operation up into the code that turns the incoming format into an iterator of dictionaries, then the code that inserts those into the database using `sqlite-utils`? That seems right for simple imports, where the incoming file represents a sequence of records in a single table. But what about more complex formats? What if a format needs to be represented as multiple tables? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752259345 | https://github.com/simonw/datasette/issues/1160#issuecomment-752259345 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI1OTM0NQ== | simonw 9599 | 2020-12-29T22:11:54Z | 2020-12-29T22:11:54Z | OWNER | Important detail from https://docs.python.org/3/library/csv.html#csv.reader
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752257666 | https://github.com/simonw/datasette/issues/1160#issuecomment-752257666 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjI1NzY2Ng== | simonw 9599 | 2020-12-29T22:09:18Z | 2020-12-29T22:09:18Z | OWNER | Figuring out the API design: I want to be able to support different formats, and be able to parse them into tables either streaming or in one go, depending on whether the format supports that. Ideally I want to be able to pull the first 1,024 bytes for the purpose of detecting the format, then replay those bytes again later. I'm considering this a stretch goal though. CSV is easy to parse as a stream - here’s how sqlite-utils does it:
Problem: using |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752236520 | https://github.com/simonw/datasette/issues/1160#issuecomment-752236520 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjIzNjUyMA== | simonw 9599 | 2020-12-29T20:48:51Z | 2020-12-29T20:48:51Z | OWNER | It would be neat if |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
751925934 | https://github.com/simonw/datasette/issues/1160#issuecomment-751925934 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MTkyNTkzNA== | simonw 9599 | 2020-12-29T02:40:13Z | 2020-12-29T20:25:57Z | OWNER | Basic command design:
The options can include:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752208036 | https://github.com/simonw/datasette/issues/1160#issuecomment-752208036 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjIwODAzNg== | simonw 9599 | 2020-12-29T19:06:35Z | 2020-12-29T19:06:35Z | OWNER | If I'm going to execute 1000s of writes in an https://stackoverflow.com/a/36648102 and https://github.com/python/asyncio/issues/284 confirm that |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
752203909 | https://github.com/simonw/datasette/issues/1160#issuecomment-752203909 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MjIwMzkwOQ== | simonw 9599 | 2020-12-29T18:54:19Z | 2020-12-29T18:54:19Z | OWNER | More thoughts on this: the key mechanism that populates the tables needs to be an |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
751947991 | https://github.com/simonw/datasette/issues/1160#issuecomment-751947991 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MTk0Nzk5MQ== | simonw 9599 | 2020-12-29T05:06:50Z | 2020-12-29T05:07:03Z | OWNER | Given the URL option could it be possible for plugins to "subscribe" to URLs that keep on streaming?
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
751946262 | https://github.com/simonw/datasette/issues/1160#issuecomment-751946262 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MTk0NjI2Mg== | simonw 9599 | 2020-12-29T04:56:12Z | 2020-12-29T04:56:32Z | OWNER | Potential design for this: a
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
751945094 | https://github.com/simonw/datasette/issues/1160#issuecomment-751945094 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MTk0NTA5NA== | simonw 9599 | 2020-12-29T04:48:11Z | 2020-12-29T04:48:11Z | OWNER | It would be pretty cool if you could launch Datasette directly against an insert-compatible file or URL without first having to load it into a SQLite database file. Or imagine being able to tail a log file and pipe that directly into a new Datasette process, which then runs a web server with the UI while simultaneously continuing to load new entries from that log into the in-memory SQLite database that it is serving... Not quite sure what that CLI interface would look like. Maybe treat that as a future stretch goal for the moment. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
751943837 | https://github.com/simonw/datasette/issues/1160#issuecomment-751943837 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MTk0MzgzNw== | simonw 9599 | 2020-12-29T04:40:30Z | 2020-12-29T04:40:30Z | OWNER | The `insert` command: it should accept more than one file name at a time for bulk inserts. If using a URL, that URL will be passed to the method that decides if a plugin implementation can handle the import or not. This will allow plugins to register themselves for specific websites. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
751926437 | https://github.com/simonw/datasette/issues/1160#issuecomment-751926437 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MTkyNjQzNw== | simonw 9599 | 2020-12-29T02:43:21Z | 2020-12-29T02:43:37Z | OWNER | Default formats to support: CSV, TSV, JSON and newline-delimited JSON (JSON-NL).
Each of these will be implemented as a default plugin. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
751926218 | https://github.com/simonw/datasette/issues/1160#issuecomment-751926218 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MTkyNjIxOA== | simonw 9599 | 2020-12-29T02:41:57Z | 2020-12-29T02:41:57Z | OWNER | Other names I considered:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 | |
751926095 | https://github.com/simonw/datasette/issues/1160#issuecomment-751926095 | https://api.github.com/repos/simonw/datasette/issues/1160 | MDEyOklzc3VlQ29tbWVudDc1MTkyNjA5NQ== | simonw 9599 | 2020-12-29T02:41:15Z | 2020-12-29T02:41:15Z | OWNER | The UI can live at
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"datasette insert" command and plugin hook 775666296 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);