

issues


60 rows where repo = 206156866 and type = "issue" sorted by updated_at descending




Suggested facets: user, comments, author_association, created_at (date), updated_at (date), closed_at (date)

state 2

  • closed 47
  • open 13

type 1

  • issue · 60

repo 1

  • twitter-to-sqlite · 60
id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app reactions draft state_reason
1816830546 I_kwDODEm0Qs5sSqJS 73 Twitter v1 API shutdown david-perez 6341745 open 0     0 2023-07-22T16:57:41Z 2023-07-22T16:57:41Z   NONE  

I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get:

[2023-07-19 21:00:04.937536]   File "/home/pi/code/liked-tweets/lib/python3.7/site-packages/twitter_to_sqlite/utils.py", line 202, in fetch_timeline
[2023-07-19 21:00:04.937606]     raise Exception(str(tweets["errors"]))
[2023-07-19 21:00:04.937678] Exception: [{'message': 'You currently have access to a subset of Twitter API v2 endpoints and limited v1.1 endpoints (e.g. media post, oauth) only. If you need access to this endpoint, you may need a different access level. You can learn more here: https://developer.twitter.com/en/portal/product', 'code': 453}]

It appears that Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they announced they'd be deprecated on 29th April.

Unfortunately retrieving likes using the v2 API is not part of their free plan. In fact, with the free plan one can only post and delete tweets and retrieve information about oneself.

So I'm afraid this is the end of this very nice project. It was very useful, thank you!

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions",
    "total_count": 1,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 1
}
   
1524431805 I_kwDODEm0Qs5a3Pu9 72 Import thread, including self- and others' replies mcint 601708 open 0     0 2023-01-08T09:51:06Z 2023-01-08T09:51:06Z   NONE  

statuses-lookup, home-timeline, mentions (only for auth'ed user) don't cover this.

twitter-to-sqlite fetch-thread tw-group1.db 1234123412341234

twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity.

For reference, this is implemented in twarc, using a search, optionally recursively.

Other research suggests that this formerly, or currently, requires a search query, use of the undocumented related_results API, or requesting inclusion of the newer conversation_id field and issuing a subsequent query.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
779088071 MDU6SXNzdWU3NzkwODgwNzE= 54 Archive import appears to be broken on recent exports jacobian 21148 open 0     5 2021-01-05T14:18:01Z 2023-01-04T11:06:55Z   CONTRIBUTOR  

I requested a Twitter export yesterday, and unfortunately they seem to have changed it such that twitter-to-sqlite import can't handle it anymore 😢

So far I've run into two issues. The first was easy to work around, but the second will take more investigation. If I can find the time I'll keep working on it and update this issue accordingly.

The issues (so far):

1. Data seems to have moved to a data/ subdirectory

Running twitter-to-sqlite import on the raw zip file reports a bunch of "not yet implemented" errors, and then exits without actually importing anything:

❯ twitter-to-sqlite import tarchive.db twitter.zip
...
data/manifest: not yet implemented
data/account-creation-ip: not yet implemented
data/account-suspension: not yet implemented
... (dozens of more lines like this, including critical stuff like data/tweets) ...

(tarchive.db now exists, but is empty)

Workaround: unpack the zip file, and run twitter-to-sqlite import tarchive.db path/to/archive/data

That gets further, but:

2. Some schema(s?) have changed

At least, the blocks schema seems different now:

❯ twitter-to-sqlite import tarchive.db archive/data
direct-messages-group: not yet implemented
branch-links: not yet implemented
periscope-expired-broadcasts: not yet implemented
direct-messages: not yet implemented
mute: not yet implemented
Traceback (most recent call last):
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/bin/twitter-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/cli.py", line 772, in import_
    archive.import_from_file(db, filepath.name, open(filepath, "rb").read())
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/archive.py", line 215, in import_from_file
    to_insert = transformer(data)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/archive.py", line 115, in lists_member
    return {"lists-member": _list_from_common(data)}
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/archive.py", line 200, in _list_from_common
    for url in block["userListInfo"]["urls"]:
KeyError: 'urls'

That's as far as I got before I needed to work on something else. I'll report back if I get further!

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1077560091 I_kwDODEm0Qs5AOkMb 61 Data Pull fails for "Essential" level access to the Twitter API (for Documentation) jmnickerson05 57161638 open 0     1 2021-12-11T14:59:41Z 2022-10-31T14:47:58Z   NONE  

Per Twitter documentation: https://developer.twitter.com/en/docs/twitter-api/getting-started/about-twitter-api#v2-access-leve

This isn't any fault of twitter-to-sqlite of course, but it should probably be documented as a side-note.

And this is how I'm surfacing the message from utils.py:

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/61/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1063982712 I_kwDODEm0Qs4_axZ4 60 Execution on Windows bernard01 1733616 open 0     1 2021-11-26T00:24:34Z 2022-10-14T16:58:27Z   NONE  

My installation on Windows using pip has been successful. I have Python 3.6.

How do I run twitter-to-sqlite? I cannot even figure out how "auth" is a command. I have python on my path: C:\prog\python\Python36;C:\prog\python\Python36\Scripts

Where should the commands be executed, and where are the files created?

Could some basics please be added to the documentation to get beginners started?

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/60/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
520508502 MDU6SXNzdWU1MjA1MDg1MDI= 31 "friends" command (similar to "followers") simonw 9599 closed 0     2 2019-11-09T20:20:20Z 2022-09-20T05:05:03Z 2020-02-07T07:03:28Z MEMBER  

Current list of commands:

  followers       Save followers for specified user (defaults to...
  followers-ids   Populate followers table with IDs of account followers
  friends-ids     Populate followers table with IDs of account friends

Obvious omission here is friends, which would be powered by https://api.twitter.com/1.1/friends/list.json: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1088816961 I_kwDODEm0Qs5A5gdB 62 KeyError: 'created_at' for private accounts? swyxio 6764957 closed 0     2 2021-12-26T17:51:51Z 2022-03-12T02:36:32Z 2022-02-24T18:10:18Z NONE  

hey Simon!

I was running twitter-to-sqlite user-timeline twitter.db for my private alt and ran into this error:

![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png)

```bash
Traceback (most recent call last):
  File "/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py", line 291, in user_timeline
    profile = utils.get_profile(db, session, **kwargs)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 133, in get_profile
    save_users(db, [profile])
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 453, in save_users
    transform_user(user)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 285, in transform_user
    user["created_at"] = parser.parse(user["created_at"])
KeyError: 'created_at'
```

This looks awfully like #37, but it can't be, because I'm authed into my account and obviously have perms to read my own account. I wonder if there are any diagnostic methods I should apply here? Just filing an issue for others to find while I investigate.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1097332098 I_kwDODEm0Qs5BZ_WC 64 Include all entities for tweets max 111631 open 0     0 2022-01-09T23:35:28Z 2022-01-09T23:35:28Z   NONE  

Per our conversation on Twitter:

It would be neat if all entities (including URLs) were captured. This way you can ensure that URLs are parsed out exactly the same way Twitter parses URLs – we all know parsing URLs with a regex ain't fun.

Right now, I believe the tool filters out all entities that are not of type media.
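
As a rough illustration of what "capture all entities" could mean, here is a minimal sketch using sqlite-utils; the save_all_entities helper and the entities_* table names are hypothetical, not part of twitter-to-sqlite:

```python
import sqlite_utils

def save_all_entities(db: sqlite_utils.Database, tweet: dict) -> None:
    # tweet["entities"] looks like {"hashtags": [...], "urls": [...],
    # "user_mentions": [...], "symbols": [...], "media": [...]}
    for entity_type, items in (tweet.get("entities") or {}).items():
        for item in items:
            # One table per entity type, each row linked back to its tweet.
            db["entities_{}".format(entity_type)].insert(
                dict(item, tweet_id=tweet["id"]), alter=True
            )
```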

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/64/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1091850530 I_kwDODEm0Qs5BFFEi 63 Import archive error 'withheld_in_countries' pauloxnet 521097 open 0     0 2022-01-01T16:58:59Z 2022-01-01T16:58:59Z   NONE  

Importing the twitter archive I received this error:

$ twitter-to-sqlite import archive.db twitter-2021-12-31-<hash>.zip
birdwatch-note-rating: not yet implemented
birdwatch-note: not yet implemented
branch-links: not yet implemented
community-tweet: not yet implemented
contact: not yet implemented
device-token: not yet implemented
direct-message-mute: not yet implemented
mute: not yet implemented
periscope-account-information: not yet implemented
periscope-ban-information: not yet implemented
periscope-broadcast-metadata: not yet implemented
periscope-comments-made-by-user: not yet implemented
periscope-expired-broadcasts: not yet implemented
periscope-followers: not yet implemented
periscope-profile-description: not yet implemented
professional-data: not yet implemented
protected-history: not yet implemented
reply-prompt: not yet implemented
screen-name-change: not yet implemented
smartblock: not yet implemented
spaces-metadata: not yet implemented
sso: not yet implemented
Traceback (most recent call last):
  File "/home/paulox/.virtualenvs/dogsheep/bin/twitter-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/cli.py", line 759, in import_
    archive.import_from_file(db, filename, content)
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/archive.py", line 246, in import_from_file
    db[table_name].insert_all(rows, pk=pk, replace=True)
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py", line 2625, in insert_all
    self.insert_chunk(
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py", line 2406, in insert_chunk
    result = self.db.execute(query, params)
  File "/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py", line 422, in execute
    return self.conn.execute(sql, parameters)
sqlite3.OperationalError: table archive_tweet has no column named withheld_in_countries

I found only a single tweet with the key withheld_in_countries in tweet.js that seems to be the problem:

[ {
  "tweet" : {
    "retweeted" : false,
    "source" : "<a href=\"http://twitter.com/download/android\" rel=\"nofollow\">Twitter for Android</a>",
    "entities" : {
      "hashtags" : [ { "text" : "NowOnAndroid", "indices" : [ "64", "77" ] } ],
      "symbols" : [ ],
      "user_mentions" : [ { "name" : "Periscope", "screen_name" : "PeriscopeCo", "indices" : [ "3", "15" ], "id_str" : "1111111111", "id" : "222222222" } ],
      "urls" : [ { "url" : "https://t.co/xxxxxxxxx", "expanded_url" : "https://vine.co/v/xxxxxxxxx", "display_url" : "vine.co/v/xxxxxxxxxx", "indices" : [ "78", "101" ] } ]
    },
    "display_text_range" : [ "0", "101" ],
    "favorite_count" : "0",
    "id_str" : "1111111111111111111111",
    "truncated" : false,
    "retweet_count" : "0",
    "withheld_in_countries" : [ "TR" ],
    "id" : "000000000000000000",
    "possibly_sensitive" : false,
    "created_at" : "Fri Aug 14 06:04:03 +0000 2015",
    "favorited" : false,
    "full_text" : "RT @periscopeco: Travel the world. LIVE. The Global Map is here #NowOnAndroid https://t.co/NZXdsPWROk",
    "lang" : "en"
  }
} ]

I solved the error by removing the key from tweet.js, but I'm reporting it to help improve the project.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/63/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
984939366 MDU6SXNzdWU5ODQ5MzkzNjY= 58 Error: Use either --since or --since_id, not both - still broken rubenv 42904 closed 0     1 2021-09-01T09:45:28Z 2021-09-21T17:37:41Z 2021-09-21T17:37:41Z CONTRIBUTOR  

Hi Simon,

It appears the fix for #57 doesn't fix things for me:

$ twitter-to-sqlite --version
twitter-to-sqlite, version 0.21.4
$ python --version
Python 3.9.6

$ twitter-to-sqlite home-timeline -a twitter-auth.json twitter/timeline.db --since
Importing tweets
Error: Use either --since or --since_id, not both

Is there any way I can help debug this?

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/58/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
907645813 MDU6SXNzdWU5MDc2NDU4MTM= 57 Error: Use either --since or --since_id, not both rubenv 42904 closed 0     6 2021-05-31T18:11:04Z 2021-08-20T00:01:31Z 2021-08-20T00:01:31Z CONTRIBUTOR  

I'm using the following command:

twitter-to-sqlite user-timeline -a twitter-auth.json twitter/tweets.db --since

Which gives the following error:

Error: Use either --since or --since_id, not both

Running without --since.

Traceback (most recent call last):
  File "/usr/local/bin/twitter-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1137, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1062, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1668, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 763, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/cli.py", line 317, in user_timeline
    for tweet in bar:
  File "/usr/local/lib/python3.9/site-packages/click/_termui_impl.py", line 328, in generator
    for rv in self.iter:
  File "/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 234, in fetch_user_timeline
    yield from fetch_timeline(
  File "/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 202, in fetch_timeline
    raise Exception(str(tweets["errors"]))
Exception: [{'code': 44, 'message': 'since_id parameter is invalid.'}]

Python 3.9.5
twitter-to-sqlite, version 0.21.3

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57/reactions",
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
796736607 MDU6SXNzdWU3OTY3MzY2MDc= 56 Not all quoted statuses get fetched? gsajko 42315895 closed 0     3 2021-01-29T09:48:44Z 2021-02-03T10:36:36Z 2021-02-03T10:36:36Z NONE  

In my database I have 13300 quote tweets, but around 3600 have quoted_status empty.

I fetched some of them using https://api.twitter.com/1.1/statuses/show.json?id=xx and they did have ids of quoted tweets.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
771324837 MDU6SXNzdWU3NzEzMjQ4Mzc= 53 --since support for favorites anotherjesse 27 closed 0     1 2020-12-19T07:08:23Z 2020-12-19T07:47:11Z 2020-12-19T07:47:11Z NONE  

Having support for --since for updating your favorites would be ideal, as the API is slow and only returns the ~3k most recent favorites.

https://twittercommunity.com/t/cant-get-all-favorite-tweets-by-rest-api/22007/3

The api seems to take an optional since_id parameter - https://developer.twitter.com/en/docs/twitter-api/v1/tweets/post-and-engage/api-reference/get-favorites-list

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
745393298 MDU6SXNzdWU3NDUzOTMyOTg= 52 Discussion: Adding support for fetching only fresh tweets fatihky 4169772 closed 0     1 2020-11-18T07:01:48Z 2020-11-18T07:12:45Z 2020-11-18T07:12:45Z NONE  

I think it'd be very useful if this tool had an option like --incremental to fetch only newer tweets. This way operations could complete very quickly in sequential runs. I'd like to try to implement this feature if it seems in keeping with this tool's purpose.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/52/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
703218448 MDU6SXNzdWU3MDMyMTg0NDg= 51 Documentation for twitter-to-sqlite fetch simonw 9599 open 0     0 2020-09-17T02:38:10Z 2020-09-17T02:38:10Z   MEMBER  

It's mentioned in passing in the README but it deserves its own section:

$ twitter-to-sqlite fetch \
    "https://api.twitter.com/1.1/account/verify_credentials.json" \
    | grep '"id"' | head -n 1

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/51/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
698791218 MDU6SXNzdWU2OTg3OTEyMTg= 50 favorites --stop_after=N stops after min(N, 200) mikepqr 370930 open 0     2 2020-09-11T03:38:14Z 2020-09-13T05:11:14Z   CONTRIBUTOR  

For any number greater than 200, favorites --stop_after stops after getting 200 tweets, e.g.

$ twitter-to-sqlite favorites tweets.db --stop_after=300
Importing favorites  [####################################]  199
$

I don't think this is a limitation of the API (if you omit --stop_after you get some very large number, possibly all of them), so I think this is a bug.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/50/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
663976976 MDU6SXNzdWU2NjM5NzY5NzY= 48 Add a table of contents to the README simonw 9599 closed 0     3 2020-07-22T18:54:33Z 2020-07-23T17:46:07Z 2020-07-22T19:03:02Z MEMBER  

Using https://github.com/jonschlinkert/markdown-toc

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
639542974 MDU6SXNzdWU2Mzk1NDI5NzQ= 47 Fall back to FTS4 if FTS5 is not available hpk42 73579 open 0     3 2020-06-16T10:11:23Z 2020-06-17T20:13:48Z   NONE  

Got this with version 0.21.1 from PyPI. twitter-to-sqlite auth worked, but then "twitter-to-sqlite user-timeline USER.db" produced a traceback ending in "no such module: FTS5".

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
610284471 MDU6SXNzdWU2MTAyODQ0NzE= 46 Error running 'search' for the first time simonw 9599 closed 0     0 2020-04-30T18:11:20Z 2020-04-30T18:11:58Z 2020-04-30T18:11:58Z MEMBER  

% twitter-to-sqlite search infodemic.db '#infodemic'
Traceback (most recent call last):
  File "/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/bin/twitter-to-sqlite", line 11, in <module>
    load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')()
  File "/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py", line 867, in search
    for tweet in tweets:
  File "/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py", line 165, in fetch_timeline
    [since_type_id, since_key],
sqlite3.OperationalError: no such table: since_ids

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/46/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
602619330 MDU6SXNzdWU2MDI2MTkzMzA= 45 Use raise_for_status() everywhere simonw 9599 open 0     1 2020-04-19T04:38:28Z 2020-04-19T04:39:22Z   MEMBER  

I keep seeing errors which I think are caused by authentication or rate limit problems but which appear to be unexpected JSON responses - presumably because they are actually an error message.

Recent example: https://github.com/simonw/jsk-fellows-on-twitter/runs/598892575

Using response.raise_for_status() everywhere will make these errors less confusing.
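
A minimal sketch of the pattern being proposed, assuming requests is the HTTP layer; fetch_json is a hypothetical helper, not the project's actual code:

```python
import requests

def fetch_json(session: requests.Session, url: str, **params) -> dict:
    response = session.get(url, params=params)
    # Raise requests.HTTPError on 4xx/5xx (auth failures, rate limits, ...)
    # instead of silently returning an error-message JSON body as if it were data.
    response.raise_for_status()
    return response.json()
```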

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/45/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
602176870 MDU6SXNzdWU2MDIxNzY4NzA= 43 "twitter-to-sqlite lists" command for retrieving a user's owned lists simonw 9599 closed 0     1 2020-04-17T19:08:59Z 2020-04-17T23:48:28Z 2020-04-17T23:30:39Z MEMBER  

https://developer.twitter.com/en/docs/accounts-and-users/create-manage-lists/api-reference/get-lists-ownerships

https://api.twitter.com/1.1/lists/ownerships.json

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/43/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
585353598 MDU6SXNzdWU1ODUzNTM1OTg= 37 Handle "User not found" error simonw 9599 closed 0     3 2020-03-20T22:14:32Z 2020-04-17T23:43:46Z 2020-04-17T23:43:46Z MEMBER  

While running user-timeline I got this bug (because a screen name I asked for didn't exist):

```
  File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py", line 185, in transform_user
    user["created_at"] = parser.parse(user["created_at"])
KeyError: 'created_at'

import pdb
pdb.pm()
/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(185)transform_user()
-> user["created_at"] = parser.parse(user["created_at"])
(Pdb) user
{'errors': [{'code': 50, 'message': 'User not found.'}]}
```
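
A minimal sketch of the kind of guard this suggests, checking for Twitter's errors payload before transforming the user; the UserNotFound exception and check_user helper are illustrative assumptions, not the project's actual fix:

```python
class UserNotFound(Exception):
    pass

def check_user(user: dict) -> dict:
    # Twitter reports a missing account as {'errors': [{'code': 50, ...}]}
    # rather than an HTTP error, so check before touching created_at.
    if "errors" in user:
        raise UserNotFound(user["errors"])
    return user
```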

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/37/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
602173589 MDU6SXNzdWU2MDIxNzM1ODk= 42 Error running user-timeline with --sql and --ids together simonw 9599 closed 0     0 2020-04-17T19:02:06Z 2020-04-17T23:34:40Z 2020-04-17T23:34:40Z MEMBER  

$ twitter-to-sqlite user-timeline tweets.db --sql='select id from users' --ids
Traceback (most recent call last):
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite", line 11, in <module>
    load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')()
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py", line 284, in user_timeline
    "@{:" + str(max(len(identifier) for identifier in identifiers)) + "}"
  File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py", line 284, in <genexpr>
    "@{:" + str(max(len(identifier) for identifier in identifiers)) + "}"
TypeError: object of type 'int' has no len()

But this DID work - casting to strings:

$ twitter-to-sqlite user-timeline tweets.db --sql='select "" || id from users' --ids
... this worked ...

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/42/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
602181581 MDU6SXNzdWU2MDIxODE1ODE= 44 tweet["source"] can be an empty string simonw 9599 closed 0     0 2020-04-17T19:18:26Z 2020-04-17T22:01:44Z 2020-04-17T22:01:44Z MEMBER  

Got this exception:

  File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py", line 641, in extract_and_save_source
    details = m.groupdict()
AttributeError: 'NoneType' object has no attribute 'groupdict'

I traced it back to this tweet: https://twitter.com/osder/status/578712651393576960

```
(Pdb) source_re
re.compile('(?P<name>.*?)')
(Pdb) locals()['source']
''
(Pdb) u
/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(393)save_tweets()
-> tweet["source"] = extract_and_save_source(db, tweet["source"])
(Pdb) tweet
{'created_at': '2015-03-20T00:20:22+00:00', 'id': 578712651393576960, 'full_text': '@osder', 'truncated': False, 'display_text_range': [0, 6], 'source': '', 'in_reply_to_status_id': 578712521382715392, 'in_reply_to_user_id': 1545741, 'in_reply_to_screen_name': 'osder', 'geo': None, 'coordinates': None, 'place': None, 'contributors': None, 'is_quote_status': False, 'retweet_count': 0, 'favorite_count': 0, 'favorited': False, 'retweeted': False, 'lang': 'und', 'user': 1545741}
```

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/44/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
591613579 MDU6SXNzdWU1OTE2MTM1Nzk= 41 Bug: recorded a since_id for None, None simonw 9599 closed 0     0 2020-04-01T04:29:43Z 2020-04-01T04:31:11Z 2020-04-01T04:31:11Z MEMBER  

This shouldn't happen in the since_ids table (relates to #39):

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/41/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
590669793 MDU6SXNzdWU1OTA2Njk3OTM= 40 Feature: record history of follower counts simonw 9599 closed 0     5 2020-03-30T23:32:28Z 2020-04-01T04:13:05Z 2020-04-01T04:13:05Z MEMBER  

We currently over-write the follower count every time we import a tweet (when we import that user profile again):

https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/utils.py#L293-L294

It would be neat if we noticed if that user's follower count (and maybe other counts?) had changed since we last saved them and recorded that change in a separate history table. This would be an inexpensive way of building up rough charts of follower count over time.
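
A minimal sketch of such a history table using sqlite-utils; the follower_count_history table and save_follower_count helper are assumptions for illustration, not the project's eventual schema:

```python
import datetime
import sqlite_utils

def save_follower_count(db: sqlite_utils.Database, user: dict) -> None:
    table = db["follower_count_history"]
    previous = None
    if table.exists():
        previous = next(
            table.rows_where("user = ?", [user["id"]], order_by="recorded_at desc"),
            None,
        )
    # Only record a new data point when the count has actually changed.
    if previous is None or previous["followers_count"] != user["followers_count"]:
        table.insert(
            {
                "user": user["id"],
                "followers_count": user["followers_count"],
                "recorded_at": datetime.datetime.utcnow().isoformat(),
            }
        )
```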

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/40/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
492297930 MDU6SXNzdWU0OTIyOTc5MzA= 10 Rethink progress bars for various commands simonw 9599 closed 0     5 2019-09-11T15:06:47Z 2020-04-01T03:45:48Z 2020-04-01T03:45:48Z MEMBER  

Progress bars and the --silent option are implemented inconsistently across commands at the moment.

This is made more challenging by the fact that for many operations the total length is not known.

https://click.palletsprojects.com/en/7.x/api/#click.progressbar

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
590666760 MDU6SXNzdWU1OTA2NjY3NjA= 39 --since feature can be confused by retweets simonw 9599 closed 0     11 2020-03-30T23:25:33Z 2020-04-01T03:45:16Z 2020-04-01T03:45:16Z MEMBER  

If you run twitter-to-sqlite user-timeline ... --since it's supposed to fetch tweets that those specific users have tweeted since the last time the command was run.

It does this by seeking out the max ID of their previous tweets:

https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311

BUT... this has a nasty flaw: if another account had retweeted one of their recent tweets, the retweeted tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!
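
One way around this flaw (and consistent with the since_ids table that shows up elsewhere in this issue list) is to record the newest ID each command actually fetched, rather than deriving it from the tweets table. A minimal sketch; the column names are assumptions:

```python
import sqlite_utils

def record_since_id(db: sqlite_utils.Database, type_: str, key: str, since_id: int) -> None:
    # One row per (command type, key such as a screen name); upsert keeps the newest value.
    db["since_ids"].upsert(
        {"type": type_, "key": key, "since_id": since_id}, pk=("type", "key")
    )

def get_since_id(db: sqlite_utils.Database, type_: str, key: str):
    if "since_ids" not in db.table_names():
        return None
    row = next(db["since_ids"].rows_where("type = ? and key = ?", [type_, key]), None)
    return row["since_id"] if row else None
```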

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
490803176 MDU6SXNzdWU0OTA4MDMxNzY= 8 --sql and --attach options for feeding commands from SQL queries simonw 9599 closed 0     4 2019-09-08T20:35:49Z 2020-03-20T23:13:01Z 2020-03-20T23:13:01Z MEMBER  

Say you want to fetch Twitter profiles for a list of accounts that are stored in another database:

$ twitter-to-sqlite users-lookup users.db --attach attending.db \
    --sql "select Twitter from attending.attendes where Twitter is not null"

The SQL query you feed in is expected to return a list of screen names suitable for processing further by the command.

Should be supported by all three of:

  • [x] twitter-to-sqlite users-lookup
  • [x] twitter-to-sqlite user-timeline
  • [x] twitter-to-sqlite followers and friends

The --attach option allows other SQLite databases to be attached to the connection. Without it the SQL query will have to read from the single attached database.
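
A minimal sketch of the SQLite mechanism behind --attach, using the standard library; the attending alias and example query mirror the command above and are otherwise assumptions:

```python
import sqlite3

# Open the main database, then attach a second one under an alias so a
# single query can read from both.
conn = sqlite3.connect("users.db")
conn.execute("ATTACH DATABASE 'attending.db' AS attending")

screen_names = [
    row[0]
    for row in conn.execute(
        "select Twitter from attending.attendes where Twitter is not null"
    )
]
```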

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
585306847 MDU6SXNzdWU1ODUzMDY4NDc= 36 twitter-to-sqlite followers/friends --sql / --attach simonw 9599 closed 0     0 2020-03-20T20:20:33Z 2020-03-20T23:12:38Z 2020-03-20T23:12:38Z MEMBER  

Split from #8. The friends and followers commands don't yet support --sql and --attach.

(friends-ids and followers-ids do though).

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/36/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
585359363 MDU6SXNzdWU1ODUzNTkzNjM= 38 Screen name display for user-timeline is uneven simonw 9599 closed 0     1 2020-03-20T22:30:23Z 2020-03-20T22:37:17Z 2020-03-20T22:37:17Z MEMBER  

CDPHE  [####################################]  67
CHFSKy  [####################################]  3216
DHSWI  [####################################]  41
DPHHSMT  [####################################]  742
Delaware_DHSS  [####################################]  3231
DhhsNevada  [####################################]  639

I could format them to match the length of the longest screen name instead.
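
A minimal sketch of the padding approach described above, assuming the screen names are available up front; the names shown are just the ones from the example output:

```python
screen_names = ["CDPHE", "CHFSKy", "DHSWI", "DPHHSMT", "Delaware_DHSS", "DhhsNevada"]
width = max(len(name) for name in screen_names)

for name in screen_names:
    # Pad every label to the longest screen name so the progress bars line up.
    print("@{:<{width}}".format(name, width=width))
```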

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/38/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
585282212 MDU6SXNzdWU1ODUyODIyMTI= 35 twitter-to-sqlite user-timeline [screen_names] --sql / --attach simonw 9599 closed 0     5 2020-03-20T19:26:07Z 2020-03-20T20:17:00Z 2020-03-20T20:16:35Z MEMBER  

Split from #8.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/35/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
585266763 MDU6SXNzdWU1ODUyNjY3NjM= 34 IndexError running user-timeline command simonw 9599 closed 0     2 2020-03-20T18:54:08Z 2020-03-20T19:20:52Z 2020-03-20T19:20:37Z MEMBER  

$ twitter-to-sqlite user-timeline data.db --screen_name Allen_Joines
Traceback (most recent call last):
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite", line 11, in <module>
    load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')()
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py", line 256, in user_timeline
    utils.save_tweets(db, chunk)
  File "/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py", line 289, in save_tweets
    db["users"].upsert(user, pk="id", alter=True)
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py", line 1128, in upsert
    conversions=conversions,
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py", line 1157, in upsert_all
    upsert=True,
  File "/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py", line 1096, in insert_all
    row = list(self.rows_where("rowid = ?", [self.last_rowid]))[0]
IndexError: list index out of range

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
561454071 MDU6SXNzdWU1NjE0NTQwNzE= 32 Documentation for " favorites" command simonw 9599 closed 0     0 2020-02-07T06:50:11Z 2020-02-07T06:59:10Z 2020-02-07T06:59:10Z MEMBER  

It looks like I forgot to document this one in the README.

https://github.com/dogsheep/twitter-to-sqlite/blob/6ebd482619bd94180e54bb7b56549c413077d329/twitter_to_sqlite/cli.py#L183-L194

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/32/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
518725064 MDU6SXNzdWU1MTg3MjUwNjQ= 29 `import` command fails on empty files jacobian 21148 closed 0     4 2019-11-06T20:34:26Z 2019-11-09T20:33:38Z 2019-11-09T19:36:36Z CONTRIBUTOR  

If a file in the export is empty (in my case it was account-suspensions.js), twitter-to-sqlite import fails:

$ twitter-to-sqlite import twitter.db ~/Downloads/twitter-2019-11-06-926f4f3be4b3b1fcb1aa387c40cd14f7c8aaf9bbcdb2d78ac14d9989add501bb.zip
Traceback (most recent call last):
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite", line 10, in <module>
    sys.exit(cli())
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py", line 627, in import_
    archive.import_from_file(db, filename, content)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/archive.py", line 224, in import_from_file
    db[table_name].upsert_all(rows, hash_id="pk")
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py", line 1113, in upsert_all
    extracts=extracts,
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py", line 980, in insert_all
    first_record = next(records)
StopIteration

This appears to be because db.upsert_all is called with no rows -- I think?

I hacked around this by modifying import_from_file to have an if rows: clause:

for table, rows in to_insert.items():
    if rows:
        table_name = "archive_{}".format(table.replace("-", "_"))
        ...

I'm happy to work up a real PR if that's the right approach, but I'm not sure it is.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
515658861 MDU6SXNzdWU1MTU2NTg4NjE= 28 Add indexes to followers table simonw 9599 closed 0     1 2019-10-31T18:40:22Z 2019-11-09T20:15:42Z 2019-11-09T20:11:48Z MEMBER  

select follower_id from following where followed_id = 12497 takes over a second for me at the moment.
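
A minimal sketch of the indexes that would make this lookup fast, using sqlite-utils; the exact set of indexes the project added may differ:

```python
import sqlite_utils

db = sqlite_utils.Database("twitter.db")
# Index both directions of the many-to-many "following" table so lookups by
# either side avoid a full table scan.
db["following"].create_index(["followed_id"], if_not_exists=True)
db["following"].create_index(["follower_id"], if_not_exists=True)
```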

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/28/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
518739697 MDU6SXNzdWU1MTg3Mzk2OTc= 30 `followers` fails because `transform_user` is called twice jacobian 21148 closed 0     2 2019-11-06T20:44:52Z 2019-11-09T20:15:28Z 2019-11-09T19:55:52Z CONTRIBUTOR  

Trying to run twitter-to-sqlite followers errors out:

Traceback (most recent call last):
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite", line 10, in <module>
    sys.exit(cli())
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py", line 130, in followers
    go(bar.update)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py", line 116, in go
    utils.save_users(db, [profile])
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py", line 302, in save_users
    transform_user(user)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py", line 181, in transform_user
    user["created_at"] = parser.parse(user["created_at"])
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 1374, in parse
    return DEFAULTPARSER.parse(timestr, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 646, in parse
    res, skipped_tokens = self._parse(timestr, **kwargs)
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 725, in _parse
    l = _timelex.split(timestr)  # Splits the timestr into tokens
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 207, in split
    return list(cls(s))
  File "/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py", line 76, in __init__
    '{itype}'.format(itype=instream.__class__.__name__))
TypeError: Parser must be a string or character stream, not datetime

This appears to be because https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L111 calls transform_user, and then https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116 calls transform_user again, which fails because the user is already transformed.

I was able to work around this by commenting out https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116.

Shall I work up a patch for that, or is there a better approach?
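
Besides removing the duplicate call, a defensive option would be to make the transformation idempotent. A minimal sketch, assuming created_at is the field that breaks on a second pass; this is illustrative, not the project's actual fix:

```python
from dateutil import parser

def transform_user(user: dict) -> None:
    created_at = user.get("created_at")
    # Only parse if the value is still the raw string from the API;
    # a second call then becomes a no-op instead of a TypeError.
    if isinstance(created_at, str):
        user["created_at"] = parser.parse(created_at)
```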

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
488833975 MDU6SXNzdWU0ODg4MzM5NzU= 3 Command for running a search and saving tweets for that search simonw 9599 closed 0     6 2019-09-03T21:29:56Z 2019-11-04T05:31:56Z 2019-11-04T05:31:16Z MEMBER  
$ twitter-to-sqlite search dogsheep
twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
514459062 MDU6SXNzdWU1MTQ0NTkwNjI= 27 retweets-of-me command simonw 9599 closed 0     4 2019-10-30T07:43:01Z 2019-11-03T01:12:58Z 2019-11-03T01:12:58Z MEMBER  

https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets_of_me

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
513074501 MDU6SXNzdWU1MTMwNzQ1MDE= 26 Command for importing mentions timeline simonw 9599 closed 0     1 2019-10-28T03:14:27Z 2019-10-30T02:36:13Z 2019-10-30T02:20:47Z MEMBER  

https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-mentions_timeline

Almost identical to home-timeline #18 but it uses https://api.twitter.com/1.1/statuses/mentions_timeline.json instead.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/26/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
506268945 MDU6SXNzdWU1MDYyNjg5NDU= 20 --since support for various commands for refresh-by-cron simonw 9599 closed 0     3 2019-10-13T03:40:46Z 2019-10-21T03:32:04Z 2019-10-16T19:26:11Z MEMBER  

I want to run a cron that updates my Twitter database every X minutes.

It should be able to retrieve the following without needing to paginate through everything:

  • [x] Tweets I have tweeted
  • [x] My home timeline (see #19)
  • [x] Tweets I have favourited

It would be nice if this could be standardized across all commands as a --since option.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
508190730 MDU6SXNzdWU1MDgxOTA3MzA= 23 Extremely simple migration system simonw 9599 closed 0     2 2019-10-17T02:13:57Z 2019-10-17T16:57:17Z 2019-10-17T16:57:17Z MEMBER  

Needed for #12. This is going to be an incredibly simple version of the Django migration system.

  • A migrations table, keeping track of which migrations were applied (and when)
  • A migrate() function which applies any pending migrations
  • A MIGRATIONS constant which is a list of functions to be applied

The function names will be detected and used as the names of the migrations.

Every time you run the CLI tool it will call the migrate() function before doing anything else.

Needs to take into account that there might be no tables at all. As such, migration functions should sanity check that the tables they are going to work on actually exist.
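
A minimal sketch of a migration system matching this description; the migrate() function and MIGRATIONS list follow the wording above, while the column names and example migration are assumptions rather than the project's exact implementation:

```python
import datetime
import sqlite_utils

def add_following_indexes(db):
    # Example migration: sanity-check that the table exists before touching it.
    if "following" in db.table_names():
        db["following"].create_index(["followed_id"], if_not_exists=True)

MIGRATIONS = [add_following_indexes]  # applied in order; function names become migration names

def migrate(db: sqlite_utils.Database) -> None:
    applied = set()
    if "migrations" in db.table_names():
        applied = {row["name"] for row in db["migrations"].rows}
    for fn in MIGRATIONS:
        if fn.__name__ not in applied:
            fn(db)
            db["migrations"].insert(
                {"name": fn.__name__, "applied": datetime.datetime.utcnow().isoformat()},
                pk="name",
            )
```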

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
508578780 MDU6SXNzdWU1MDg1Nzg3ODA= 25 Ensure migrations don't accidentally create foreign key twice simonw 9599 closed 0     2 2019-10-17T16:08:50Z 2019-10-17T16:56:47Z 2019-10-17T16:56:47Z MEMBER  

Is it possible for these lines to run against a database table that already has these foreign keys?

https://github.com/dogsheep/twitter-to-sqlite/blob/c9295233f219c446fa2085cace987067488a31b9/twitter_to_sqlite/migrations.py#L21-L22

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
503053800 MDU6SXNzdWU1MDMwNTM4MDA= 12 Extract "source" into a separate lookup table simonw 9599 closed 0     3 2019-10-06T05:17:23Z 2019-10-17T15:49:24Z 2019-10-17T15:49:24Z MEMBER  

It's pretty bulky and ugly at the moment:

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
506087267 MDU6SXNzdWU1MDYwODcyNjc= 19 since_id support for home-timeline simonw 9599 closed 0     3 2019-10-11T22:48:24Z 2019-10-16T19:13:06Z 2019-10-16T19:12:46Z MEMBER  

Currently every time you run home-timeline we pull all 800 available tweets. We should offer to support since_id (which can be provided or can be pulled directly from the database) in order to work more efficiently if this command is executed e.g. on a cron.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
508024032 MDU6SXNzdWU1MDgwMjQwMzI= 22 Ability to import from uncompressed archive or from specific files simonw 9599 closed 0     0 2019-10-16T18:31:57Z 2019-10-16T18:53:36Z 2019-10-16T18:53:36Z MEMBER  

Currently you can only import like this:

$ twitter-to-sqlite import path-to-twitter.zip

It would be useful if you could import from a folder that was decompressed from that zip:

$ twitter-to-sqlite import path-to-twitter/

AND from individual files within that folder - since that would allow you to e.g. selectively import certain files:

$ twitter-to-sqlite import path-to-twitter/favorites.js path-to-twitter/tweets.js
twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/22/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
506432572 MDU6SXNzdWU1MDY0MzI1NzI= 21 Fix &amp; escapes in tweet text simonw 9599 closed 0     1 2019-10-14T03:37:28Z 2019-10-15T18:48:16Z 2019-10-15T18:48:16Z MEMBER  

Shouldn't be storing &amp; here - the escaped entity should be turned back into a plain & before the tweet text is saved.
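Python's html module can undo these escapes before the text is stored; a tiny self-contained example:

import html

fixed = html.unescape("Peanut butter &amp; jelly &gt; toast")
# fixed == "Peanut butter & jelly > toast"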

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
503244410 MDU6SXNzdWU1MDMyNDQ0MTA= 14 When importing favorites, record which user favorited them simonw 9599 closed 0     0 2019-10-07T05:45:11Z 2019-10-14T03:30:25Z 2019-10-14T03:30:25Z MEMBER  

This code currently just dumps them into the tweets table without recording which user favorited them.

https://github.com/dogsheep/twitter-to-sqlite/blob/436a170d74ec70903d1b4ca430c2c6b6435cdfcc/twitter_to_sqlite/cli.py#L152-L157
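A sketch of one way to record that relationship using sqlite-utils' m2m support (the favorited_by table name and the exact record shapes are assumptions):

import sqlite_utils

def save_favorite(db, user, tweet):
    # Upsert both records, then link them through a many-to-many
    # "favorited_by" table so we know who favorited what
    db["users"].upsert(user, pk="id", alter=True)
    db["tweets"].upsert(tweet, pk="id", alter=True).m2m(
        "users", user, pk="id", m2m_table="favorited_by"
    )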

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/14/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
505928530 MDU6SXNzdWU1MDU5Mjg1MzA= 18 Command to import home-timeline simonw 9599 closed 0     4 2019-10-11T15:47:54Z 2019-10-11T16:51:33Z 2019-10-11T16:51:12Z MEMBER  

Feature request: https://twitter.com/johankj/status/1182563563136868352

Would it be possible to save all tweets in my timeline from the last X days? I would love to see how big a percentage some users are of my daily timeline as a metric on whether I should unfollow them/move them to a list.
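Once the timeline is in SQLite, that percentage question becomes a query; a sketch, assuming timeline tweets land in a timeline_tweets table with a user column pointing at users:

import sqlite_utils

db = sqlite_utils.Database("twitter.db")
sql = """
select users.screen_name,
       count(*) as tweets,
       round(100.0 * count(*) / (select count(*) from timeline_tweets), 1) as percent
from timeline_tweets join users on timeline_tweets.user = users.id
group by users.id
order by tweets desc
"""
for screen_name, tweets, percent in db.execute(sql).fetchall():
    print(screen_name, tweets, percent)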

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
505674949 MDU6SXNzdWU1MDU2NzQ5NDk= 17 import command should empty all archive-* tables first simonw 9599 closed 0     2 2019-10-11T06:58:43Z 2019-10-11T15:40:08Z 2019-10-11T15:40:08Z MEMBER  

Can have a CLI option for NOT doing that.
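A sketch of what emptying those tables could look like, with the prefix mirroring the "archive-*" naming in the issue title (the option name below is hypothetical):

import sqlite_utils

def empty_archive_tables(db):
    # Delete existing rows from every archive-* table so a fresh import
    # fully replaces the previous one instead of mixing with it
    for name in db.table_names():
        if name.startswith("archive-"):
            db[name].delete_where()

# A --no-empty style option (name is an assumption) could skip this step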

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
505673645 MDU6SXNzdWU1MDU2NzM2NDU= 16 Do a better job with archived direct message threads simonw 9599 open 0     0 2019-10-11T06:55:21Z 2019-10-11T06:55:27Z   MEMBER  

https://github.com/dogsheep/twitter-to-sqlite/blob/fb2698086d766e0333a55bb73435e7283feeb438/twitter_to_sqlite/archive.py#L98-L99

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/16/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
488835586 MDU6SXNzdWU0ODg4MzU1ODY= 4 Command for importing data from a Twitter Export file simonw 9599 closed 0     2 2019-09-03T21:34:13Z 2019-10-11T06:45:02Z 2019-10-11T06:45:02Z MEMBER  

Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data

A command for importing this data into SQLite would be extremely useful.

$ twitter-to-sqlite import twitter.db path-to-archive.zip
twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
503085013 MDU6SXNzdWU1MDMwODUwMTM= 13 statuses-lookup command simonw 9599 closed 0     1 2019-10-06T11:00:20Z 2019-10-07T00:33:49Z 2019-10-07T00:31:44Z MEMBER  

For bulk retrieving tweets by their ID.

https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-lookup

Rate limit is 900/15 minutes (1 call per second) but each call can pull up to 100 IDs, so we can pull 6,000 per minute.

Should support --sql and --attach #8
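Batching the IDs is straightforward; a sketch of the chunking (helper name made up):

def chunks(ids, size=100):
    # statuses/lookup accepts at most 100 IDs per request
    ids = list(ids)
    for i in range(0, len(ids), size):
        yield ids[i : i + size]

# for batch in chunks(tweet_ids):
#     params = {"id": ",".join(map(str, batch))}
#     ...one API call per batch, staying within 900 calls per 15-minute window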

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/13/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
503045221 MDU6SXNzdWU1MDMwNDUyMjE= 11 Commands for recording real-time tweets from the streaming API simonw 9599 closed 0     1 2019-10-06T03:09:30Z 2019-10-06T04:54:17Z 2019-10-06T04:48:31Z MEMBER  

https://developer.twitter.com/en/docs/tweets/filter-realtime/api-reference/post-statuses-filter

We can support tracking keywords and following specific users.

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/11/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
491791152 MDU6SXNzdWU0OTE3OTExNTI= 9 followers-ids and friends-ids subcommands simonw 9599 closed 0     1 2019-09-10T16:58:15Z 2019-09-10T17:36:55Z 2019-09-10T17:36:55Z MEMBER  

These will import follower and friendship IDs into the following tables, using these APIs:

https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-followers-ids
https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-ids

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/9/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
490798130 MDU6SXNzdWU0OTA3OTgxMzA= 7 users-lookup command for fetching users simonw 9599 closed 0     0 2019-09-08T19:47:59Z 2019-09-08T20:32:13Z 2019-09-08T20:32:13Z MEMBER  

https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup

https://api.twitter.com/1.1/users/lookup.json?user_id=783214,6253282
https://api.twitter.com/1.1/users/lookup.json?screen_name=simonw,cleopaws

CLI design:

$ twitter-to-sqlite users-lookup simonw cleopaws
$ twitter-to-sqlite users-lookup 783214 6253282 --ids

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/7/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
489419782 MDU6SXNzdWU0ODk0MTk3ODI= 6 Extract extended_entities into a media table simonw 9599 closed 0     0 2019-09-04T21:59:10Z 2019-09-04T22:08:01Z 2019-09-04T22:08:01Z MEMBER  

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/6/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
488833136 MDU6SXNzdWU0ODg4MzMxMzY= 1 Imported followers should go in "users", relationships in "following" simonw 9599 closed 0     0 2019-09-03T21:27:37Z 2019-09-04T20:23:04Z 2019-09-04T20:23:04Z MEMBER  

Right now twitter-to-sqlite followers dumps everything in a followers table, and doesn't actually record which account they are following!

It should instead save them all in a global users table and then set up m2m relationships in a following table. This also means it should create a record for the specified user in order to record both sides of each relationship.
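A sketch of that shape using sqlite-utils (the column names on the following table are assumptions):

import sqlite_utils

def save_followers(db, profile, followers):
    # profile is the account whose followers were fetched; every account
    # involved goes into "users", each relationship into "following"
    db["users"].upsert(profile, pk="id", alter=True)
    db["users"].upsert_all(followers, pk="id", alter=True)
    db["following"].upsert_all(
        (
            {"followed_id": profile["id"], "follower_id": f["id"]}
            for f in followers
        ),
        pk=("followed_id", "follower_id"),
    )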

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/1/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
488833698 MDU6SXNzdWU0ODg4MzM2OTg= 2 "twitter-to-sqlite user-timeline" command for pulling tweets by a specific user simonw 9599 closed 0     3 2019-09-03T21:29:12Z 2019-09-04T20:02:11Z 2019-09-04T20:02:11Z MEMBER  

Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html

I'm going to do:

$ twitter-to-sqlite tweets simonw
twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
488874815 MDU6SXNzdWU0ODg4NzQ4MTU= 5 Write tests that simulate the Twitter API simonw 9599 open 0     1 2019-09-03T23:55:35Z 2019-09-03T23:56:28Z   MEMBER  

I can use betamax for this: https://pypi.org/project/betamax/
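A rough idea of what a betamax-backed test could look like (the cassette directory, cassette name and endpoint are illustrative, and authentication is ignored here):

import betamax
import requests

session = requests.Session()
recorder = betamax.Betamax(session, cassette_library_dir="tests/cassettes")

with recorder.use_cassette("users-lookup"):
    # The first run records the real HTTP interaction to a cassette file;
    # subsequent runs replay it, so tests never need to hit api.twitter.com
    response = session.get(
        "https://api.twitter.com/1.1/users/lookup.json",
        params={"screen_name": "simonw"},
    )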

twitter-to-sqlite 206156866 issue    
{
    "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT, [reactions] TEXT, [draft] INTEGER, [state_reason] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);