issue_comments

3 rows where issue = 506268945 sorted by updated_at descending
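
This view corresponds to a simple query against the issue_comments table. A minimal sketch of the underlying SQL, assuming Datasette's default select-everything behaviour for a filtered table page:

    select *
    from issue_comments
    where issue = 506268945
    order by updated_at desc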

id: 544335363
html_url: https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-544335363
issue_url: https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20
node_id: MDEyOklzc3VlQ29tbWVudDU0NDMzNTM2Mw==
user: simonw (9599)
created_at: 2019-10-21T03:32:04Z
updated_at: 2019-10-21T03:32:04Z
author_association: MEMBER
body:

In case anyone is interested, here's an extract from the crontab I'm running these under at the moment:

    1,11,21,31,41,51 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite user-timeline /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --since
    2,7,12,17,22,27,32,37,42,47,52,57 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite home-timeline /home/ubuntu/timeline.db -a /home/ubuntu/auth.json --since
    6,16,26,36,46,56 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite favorites /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --stop_after=50

reactions:
{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: --since support for various commands for refresh-by-cron (506268945)

id: 542854749
html_url: https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-542854749
issue_url: https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg1NDc0OQ==
user: simonw (9599)
created_at: 2019-10-16T19:26:01Z
updated_at: 2019-10-16T19:26:01Z
author_association: MEMBER
body:

I'm not going to do this for "accounts that have followed me" and "new accounts that I have followed"; instead I will recommend running the friend_ids and followers_ids commands on a daily basis, since that data doesn't really change much by the hour.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: --since support for various commands for refresh-by-cron (506268945)
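
The comment above suggests a daily schedule for the follower data. A hypothetical sketch of what that could look like, reusing the paths from the crontab extract in the first comment and assuming twitter-to-sqlite's hyphenated command names (friends-ids, followers-ids) with simonw as the target screen name; none of this is taken from the issue itself:

    # Hypothetical daily refresh; paths reused from the earlier crontab extract
    0 4 * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite friends-ids /home/ubuntu/twitter.db simonw -a /home/ubuntu/auth.json
    10 4 * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite followers-ids /home/ubuntu/twitter.db simonw -a /home/ubuntu/auth.json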

id: 541388038
html_url: https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-541388038
issue_url: https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20
node_id: MDEyOklzc3VlQ29tbWVudDU0MTM4ODAzOA==
user: simonw (9599)
created_at: 2019-10-13T05:31:58Z
updated_at: 2019-10-13T05:31:58Z
author_association: MEMBER
body:

For favourites, a --stop_after=200 option is probably good enough.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: --since support for various commands for refresh-by-cron (506268945)
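
Applied to the favorites line from the crontab extract in the first comment, that would simply mean raising the cut-off; a sketch with the same assumed paths:

    # Same schedule as the earlier extract, but stop after 200 favourites per run
    6,16,26,36,46,56 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite favorites /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --stop_after=200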

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
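
Those two indexes are what make views like this one cheap: the page's issue filter can be served from idx_issue_comments_issue. As an illustrative sketch (not part of the page itself), a per-user comment count over the same issue:

    -- Counts comments per user on issue 506268945;
    -- the WHERE clause is satisfied by idx_issue_comments_issue
    select user, count(*) as num_comments
    from issue_comments
    where issue = 506268945
    group by user;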