issues
46 rows where repo = 206156866 and user = 9599 sorted by updated_at descending
id | number | title | user | state | comments | created_at | updated_at | closed_at | type | state_reason | body
---|---|---|---|---|---|---|---|---|---|---|---
520508502 | 31 | "friends" command (similar to "followers") | simonw 9599 | closed | 2 | 2019-11-09T20:20:20Z | 2022-09-20T05:05:03Z | 2020-02-07T07:03:28Z | issue | completed | Current list of commands: …
703218448 | 51 | Documentation for twitter-to-sqlite fetch | simonw 9599 | open | 0 | 2020-09-17T02:38:10Z | 2020-09-17T02:38:10Z | | issue | | It's mentioned in passing in the README but it deserves its own section: …
663976976 | 48 | Add a table of contents to the README | simonw 9599 | closed | 3 | 2020-07-22T18:54:33Z | 2020-07-23T17:46:07Z | 2020-07-22T19:03:02Z | issue | completed |
610284471 | 46 | Error running 'search' for the first time | simonw 9599 | closed | 0 | 2020-04-30T18:11:20Z | 2020-04-30T18:11:58Z | 2020-04-30T18:11:58Z | issue | completed |
602619330 | 45 | Use raise_for_status() everywhere | simonw 9599 | open | 1 | 2020-04-19T04:38:28Z | 2020-04-19T04:39:22Z | | issue | | I keep seeing errors which I think are caused by authentication or rate limit problems but which appear to be unexpected JSON responses - presumably because they are actually an error message. Recent example: https://github.com/simonw/jsk-fellows-on-twitter/runs/598892575 Using …
602176870 | 43 | "twitter-to-sqlite lists" command for retrieving a user's owned lists | simonw 9599 | closed | 1 | 2020-04-17T19:08:59Z | 2020-04-17T23:48:28Z | 2020-04-17T23:30:39Z | issue | completed |
585353598 | 37 | Handle "User not found" error | simonw 9599 | closed | 3 | 2020-03-20T22:14:32Z | 2020-04-17T23:43:46Z | 2020-04-17T23:43:46Z | issue | completed | While running …
602173589 | 42 | Error running user-timeline with --sql and --ids together | simonw 9599 | closed | 0 | 2020-04-17T19:02:06Z | 2020-04-17T23:34:40Z | 2020-04-17T23:34:40Z | issue | completed |
602181581 | 44 | tweet["source"] can be an empty string | simonw 9599 | closed | 0 | 2020-04-17T19:18:26Z | 2020-04-17T22:01:44Z | 2020-04-17T22:01:44Z | issue | completed | Got this excepion: …
591613579 | 41 | Bug: recorded a since_id for None, None | simonw 9599 | closed | 0 | 2020-04-01T04:29:43Z | 2020-04-01T04:31:11Z | 2020-04-01T04:31:11Z | issue | completed | This shouldn't happen in the …
590669793 | 40 | Feature: record history of follower counts | simonw 9599 | closed | 5 | 2020-03-30T23:32:28Z | 2020-04-01T04:13:05Z | 2020-04-01T04:13:05Z | issue | completed | We currently over-write the follower count every time we import a tweet (when we import that user profile again): … It would be neat if we noticed if that user's follower count (and maybe other counts?) had changed since we last saved them and recorded that change in a separate history table. This would be an inexpensive way of building up rough charts of follower count over time.
492297930 | 10 | Rethink progress bars for various commands | simonw 9599 | closed | 5 | 2019-09-11T15:06:47Z | 2020-04-01T03:45:48Z | 2020-04-01T03:45:48Z | issue | completed | Progress bars and the … This is made more challenging by the fact that for many operations the total length is not known. https://click.palletsprojects.com/en/7.x/api/#click.progressbar
590666760 | 39 | --since feature can be confused by retweets | simonw 9599 | closed | 11 | 2020-03-30T23:25:33Z | 2020-04-01T03:45:16Z | 2020-04-01T03:45:16Z | issue | completed | If you run … It does this by seeking out the max ID of their previous tweets: … BUT... this has a nasty flaw: if another account had retweeted one of their recent tweets the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!
490803176 | 8 | --sql and --attach options for feeding commands from SQL queries | simonw 9599 | closed | 4 | 2019-09-08T20:35:49Z | 2020-03-20T23:13:01Z | 2020-03-20T23:13:01Z | issue | completed | Say you want to fetch Twitter profiles for a list of accounts that are stored in another database: … The SQL query you feed in is expected to return a list of screen names suitable for processing further by the command. Should be supported by all three of: … The …
585306847 | 36 | twitter-to-sqlite followers/friends --sql / --attach | simonw 9599 | closed | 0 | 2020-03-20T20:20:33Z | 2020-03-20T23:12:38Z | 2020-03-20T23:12:38Z | issue | completed | Split from #8. The ( …
585359363 | 38 | Screen name display for user-timeline is uneven | simonw 9599 | closed | 1 | 2020-03-20T22:30:23Z | 2020-03-20T22:37:17Z | 2020-03-20T22:37:17Z | issue | completed |
585282212 | 35 | twitter-to-sqlite user-timeline [screen_names] --sql / --attach | simonw 9599 | closed | 5 | 2020-03-20T19:26:07Z | 2020-03-20T20:17:00Z | 2020-03-20T20:16:35Z | issue | completed | Split from #8.
561469252 | 33 | Upgrade to sqlite-utils 2.2.1 | simonw 9599 | closed | 1 | 2020-02-07T07:32:12Z | 2020-03-20T19:21:42Z | 2020-03-20T19:21:41Z | pull | |
585266763 | 34 | IndexError running user-timeline command | simonw 9599 | closed | 2 | 2020-03-20T18:54:08Z | 2020-03-20T19:20:52Z | 2020-03-20T19:20:37Z | issue | completed |
561454071 | 32 | Documentation for " favorites" command | simonw 9599 | closed | 0 | 2020-02-07T06:50:11Z | 2020-02-07T06:59:10Z | 2020-02-07T06:59:10Z | issue | completed | It looks like I forgot to document this one in the README.
515658861 | 28 | Add indexes to followers table | simonw 9599 | closed | 1 | 2019-10-31T18:40:22Z | 2019-11-09T20:15:42Z | 2019-11-09T20:11:48Z | issue | completed |
488833975 | 3 | Command for running a search and saving tweets for that search | simonw 9599 | closed | 6 | 2019-09-03T21:29:56Z | 2019-11-04T05:31:56Z | 2019-11-04T05:31:16Z | issue | completed |
514459062 | 27 | retweets-of-me command | simonw 9599 | closed | 4 | 2019-10-30T07:43:01Z | 2019-11-03T01:12:58Z | 2019-11-03T01:12:58Z | issue | completed |
513074501 | 26 | Command for importing mentions timeline | simonw 9599 | closed | 1 | 2019-10-28T03:14:27Z | 2019-10-30T02:36:13Z | 2019-10-30T02:20:47Z | issue | completed | https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-mentions_timeline Almost identical to home-timeline #18 but it uses …
506268945 | 20 | --since support for various commands for refresh-by-cron | simonw 9599 | closed | 3 | 2019-10-13T03:40:46Z | 2019-10-21T03:32:04Z | 2019-10-16T19:26:11Z | issue | completed | I want to run a cron that updates my Twitter database every X minutes. It should be able to retrieve the following without needing to paginate through everything: … It would be nice if this could be standardized across all commands as a …
508190730 | 23 | Extremely simple migration system | simonw 9599 | closed | 2 | 2019-10-17T02:13:57Z | 2019-10-17T16:57:17Z | 2019-10-17T16:57:17Z | issue | completed | Needed for #12. This is going to be an incredibly simple version of the Django migration system. … The function names will be detected and used as the names of the migrations. Every time you run the CLI tool it will call the … Needs to take into account that there might be no tables at all. As such, migration functions should sanity check that the tables they are going to work on actually exist.
508578780 | 25 | Ensure migrations don't accidentally create foreign key twice | simonw 9599 | closed | 2 | 2019-10-17T16:08:50Z | 2019-10-17T16:56:47Z | 2019-10-17T16:56:47Z | issue | completed | Is it possible for these lines to run against a database table that already has these foreign keys?
508553387 | 24 | Tweet source extraction and new migration system | simonw 9599 | closed | 0 | 2019-10-17T15:24:56Z | 2019-10-17T15:49:29Z | 2019-10-17T15:49:24Z | pull | | Closes #12 and #23
503053800 | 12 | Extract "source" into a separate lookup table | simonw 9599 | closed | 3 | 2019-10-06T05:17:23Z | 2019-10-17T15:49:24Z | 2019-10-17T15:49:24Z | issue | completed | It's pretty bulky and ugly at the moment: …
506087267 | 19 | since_id support for home-timeline | simonw 9599 | closed | 3 | 2019-10-11T22:48:24Z | 2019-10-16T19:13:06Z | 2019-10-16T19:12:46Z | issue | completed | Currently every time you run …
508024032 | 22 | Ability to import from uncompressed archive or from specific files | simonw 9599 | closed | 0 | 2019-10-16T18:31:57Z | 2019-10-16T18:53:36Z | 2019-10-16T18:53:36Z | issue | completed | Currently you can only import like this: … It would be useful if you could import from a folder that was decompressed from that zip: … AND from individual files within that folder - since that would allow you to e.g. selectively import certain files: …
506432572 | 21 | Fix & escapes in tweet text | simonw 9599 | closed | 1 | 2019-10-14T03:37:28Z | 2019-10-15T18:48:16Z | 2019-10-15T18:48:16Z | issue | completed | Shouldn't be storing …
503244410 | 14 | When importing favorites, record which user favorited them | simonw 9599 | closed | 0 | 2019-10-07T05:45:11Z | 2019-10-14T03:30:25Z | 2019-10-14T03:30:25Z | issue | completed | This code currently just dumps them into the …
505928530 | 18 | Command to import home-timeline | simonw 9599 | closed | 4 | 2019-10-11T15:47:54Z | 2019-10-11T16:51:33Z | 2019-10-11T16:51:12Z | issue | completed | Feature request: https://twitter.com/johankj/status/1182563563136868352 …
505674949 | 17 | import command should empty all archive-* tables first | simonw 9599 | closed | 2 | 2019-10-11T06:58:43Z | 2019-10-11T15:40:08Z | 2019-10-11T15:40:08Z | issue | completed | Can have a CLI option for NOT doing that.
505673645 | 16 | Do a better job with archived direct message threads | simonw 9599 | open | 0 | 2019-10-11T06:55:21Z | 2019-10-11T06:55:27Z | | issue | |
488835586 | 4 | Command for importing data from a Twitter Export file | simonw 9599 | closed | 2 | 2019-09-03T21:34:13Z | 2019-10-11T06:45:02Z | 2019-10-11T06:45:02Z | issue | completed | Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data A command for importing this data into SQLite would be extremely useful. …
505666744 | 15 | twitter-to-sqlite import command, refs #4 | simonw 9599 | closed | 0 | 2019-10-11T06:37:14Z | 2019-10-11T06:45:01Z | 2019-10-11T06:45:01Z | pull | |
503085013 | 13 | statuses-lookup command | simonw 9599 | closed | 1 | 2019-10-06T11:00:20Z | 2019-10-07T00:33:49Z | 2019-10-07T00:31:44Z | issue | completed | For bulk retrieving tweets by their ID. https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-lookup Rate limit is 900/15 minutes (1 call per second) but each call can pull up to 100 IDs, so we can pull 6,000 per minute. Should support …
503045221 | 11 | Commands for recording real-time tweets from the streaming API | simonw 9599 | closed | 1 | 2019-10-06T03:09:30Z | 2019-10-06T04:54:17Z | 2019-10-06T04:48:31Z | issue | completed | https://developer.twitter.com/en/docs/tweets/filter-realtime/api-reference/post-statuses-filter We can support tracking keywords and following specific users.
491791152 | 9 | followers-ids and friends-ids subcommands | simonw 9599 | closed | 1 | 2019-09-10T16:58:15Z | 2019-09-10T17:36:55Z | 2019-09-10T17:36:55Z | issue | completed | These will import follower and friendship IDs into the following tables, using these APIs: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-followers-ids https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-ids
490798130 | 7 | users-lookup command for fetching users | simonw 9599 | closed | 0 | 2019-09-08T19:47:59Z | 2019-09-08T20:32:13Z | 2019-09-08T20:32:13Z | issue | completed | https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup …
489419782 | 6 | Extract extended_entities into a media table | simonw 9599 | closed | 0 | 2019-09-04T21:59:10Z | 2019-09-04T22:08:01Z | 2019-09-04T22:08:01Z | issue | completed |
488833136 | 1 | Imported followers should go in "users", relationships in "following" | simonw 9599 | closed | 0 | 2019-09-03T21:27:37Z | 2019-09-04T20:23:04Z | 2019-09-04T20:23:04Z | issue | completed | Right now … It should instead save them all in a global …
488833698 | 2 | "twitter-to-sqlite user-timeline" command for pulling tweets by a specific user | simonw 9599 | closed | 3 | 2019-09-03T21:29:12Z | 2019-09-04T20:02:11Z | 2019-09-04T20:02:11Z | issue | completed | Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html I'm going to do: …
488874815 | 5 | Write tests that simulate the Twitter API | simonw 9599 | open | 1 | 2019-09-03T23:55:35Z | 2019-09-03T23:56:28Z | | issue | | I can use betamax for this: https://pypi.org/project/betamax/
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
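For reference, a minimal SQL sketch of the query behind this page (the 46 rows where repo = 206156866 and user = 9599, sorted by updated_at descending), assuming the [issues] table above is loaded into a SQLite database; the selected columns are a trimmed subset chosen for readability:

```sql
-- Filter and sort used by this view, run against the [issues] table above.
select
  id, number, title, state, comments,
  created_at, updated_at, closed_at, type, state_reason
from issues
where repo = 206156866   -- dogsheep/twitter-to-sqlite
  and [user] = 9599      -- simonw
order by updated_at desc;
```

Swapping the select list for count(*) reproduces the row count shown in the header above.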