issues


10 rows where type = "issue" and user = 7908073 sorted by updated_at descending


id node_id number title user state locked assignee milestone comments created_at updated_at ▲ closed_at author_association pull_request body repo type active_lock_reason performed_via_github_app reactions draft state_reason
1239034903 I_kwDOCGYnMM5J2iwX 433 CLI eats my cursor chapmanjacobd 7908073 closed 0     10 2022-05-17T18:52:52Z 2023-11-04T00:46:30Z 2023-11-04T00:46:30Z CONTRIBUTOR  

I'm not sure why this happens, but sqlite-utils makes my terminal cursor disappear after running commands like sqlite-utils insert. I've only noticed this behavior in sqlite-utils, not in any other CLI tools.

I can still type commands after it runs, but the text cursor is invisible.
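
If the cursor was hidden with the terminal's DECTCEM "hide cursor" escape and never re-shown (an assumption about the cause, not something established in this issue), it can be restored from Python, or equivalently with `tput cnorm`:

```python
import sys

# Emit the DECTCEM "show cursor" escape (CSI ? 25 h), the inverse of the
# "hide cursor" sequence a progress bar can leave behind if it exits uncleanly.
sys.stdout.write("\x1b[?25h")
sys.stdout.flush()
```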

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/433/reactions",
    "total_count": 5,
    "+1": 5,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1655860104 I_kwDOCGYnMM5ismuI 535 rows: --transpose or psql extended view-like functionality chapmanjacobd 7908073 closed 0     2 2023-04-05T15:37:33Z 2023-06-15T08:39:49Z 2023-06-14T22:05:28Z CONTRIBUTOR  

It would be nice if the rows subcommand had a flag, perhaps called --transpose, which would print in long form instead of wide, similar to extended display mode in psql (\x).

In other words instead of this:

sqlite-utils rows --limit 5 --fmt github track_metadata.db songs

| track_id | title | song_id | release | artist_id | artist_mbid | artist_name | duration | artist_familiarity | artist_hotttnesss | year | track_7digitalid | shs_perf | shs_work |
|--------------------|-------------------|--------------------|--------------------------------------|--------------------|--------------------------------------|------------------|------------|----------------------|---------------------|--------|--------------------|------------|------------|
| TRMMMYQ128F932D901 | Silent Night | SOQMMHC12AB0180CB8 | Monster Ballads X-Mas | ARYZTJS1187B98C555 | 357ff05d-848a-44cf-b608-cb34b5701ae5 | Faster Pussy cat | 252.055 | 0.649822 | 0.394032 | 2003 | 7032331 | -1 | 0 |
| TRMMMKD128F425225D | Tanssi vaan | SOVFVAK12A8C1350D9 | Karkuteillä | ARMVN3U1187FB3A1EB | 8d7ef530-a6fd-4f8f-b2e2-74aec765e0f9 | Karkkiautomaatti | 156.551 | 0.439604 | 0.356992 | 1995 | 1514808 | -1 | 0 |
| TRMMMRX128F93187D9 | No One Could Ever | SOGTUKN12AB017F4F1 | Butter | ARGEKB01187FB50750 | 3d403d44-36ce-465c-ad43-ae877e65adc4 | Hudson Mohawke | 138.971 | 0.643681 | 0.437504 | 2006 | 6945353 | -1 | 0 |
| TRMMMCH128F425532C | Si Vos Querés | SOBNYVR12A8C13558C | De Culo | ARNWYLR1187B9B2F9C | 12be7648-7094-495f-90e6-df4189d68615 | Yerba Brava | 145.058 | 0.448501 | 0.372349 | 2003 | 2168257 | -1 | 0 |
| TRMMMWA128F426B589 | Tangle Of Aspens | SOHSBXH12A8C13B0DF | Rene Ablaze Presents Winter Sessions | AREQDTE1269FB37231 | | Der Mystic | 514.298 | 0 | 0 | 0 | 2264873 | -1 | 0 |

The output would look something like this:

$ for col in (sqlite-columns track_metadata.db songs) sqlite-utils --fmt github track_metadata.db "select $col from songs order by rowid desc limit 5" end

| track_id |
|--------------------|
| TRYYYVU12903CD01E3 |
| TRYYYDJ128F9310A21 |
| TRYYYMG128F4260ECA |
| TRYYYJO128F426DA37 |
| TRYYYUS12903CD2DF0 |

| title |
|-------------------------------------|
| Fernweh feat. Sektion Kuchikäschtli |
| Faraday |
| Novemba |
| Jago Chhadeo |
| O Samba Da Vida |

| song_id |
|--------------------|
| SOWXJXQ12AB0189F43 |
| SOLXGOR12A81C21EB7 |
| SOHODZI12A8C137BB3 |
| SOXQYIQ12A8C137FBB |
| SOTXAME12AB018F136 |

| release |
|---------------------------------|
| So Oder So |
| The Trance Collection Vol. 2 |
| Dub_Connected: electronic music |
| Naale Baba Lassi Pee Gya |
| Pacha V.I.P. |

| artist_id |
|--------------------|
| AR7PLM21187B990D08 |
| ARCMCOK1187B9B1073 |
| ARZ3R6M1187B9AF750 |
| ART5FZD1187B9A7FCF |
| AR7Z4J81187FB3FC59 |

| artist_mbid |
|--------------------------------------|
| 3af2b07e-c91c-4160-9bda-f0b9e3144ed3 |
| 4ac5f3de-c5ad-475e-ad50-41f1ef9dba20 |
| 8b97e9c8-61f5-4615-9a96-276f24204e34 |
| 2357c400-9109-42b6-b3fe-9e2d9f8e3872 |
| 9d50cb20-7e42-45cc-b0dd-154c3e92a577 |

| artist_name |
|----------------|
| Texta |
| Elude |
| Gabriel Le Mar |
| Kuldeep Manak |
| Kiko Navarro |

| duration |
|------------|
| 295.079 |
| 484.519 |
| 553.038 |
| 244.166 |
| 217.443 |

| artist_familiarity |
|----------------------|
| 0.552977 |
| 0.403668 |
| 0.556918 |
| 0.4015 |
| 0.528617 |

| artist_hotttnesss |
|---------------------|
| 0.454869 |
| 0.256935 |
| 0.336914 |
| 0.374866 |
| 0.411595 |

| year |
|--------|
| 2004 |
| 0 |
| 0 |
| 0 |
| 0 |

| track_7digitalid |
|--------------------|
| 8486723 |
| 5472456 |
| 2219291 |
| 1632096 |
| 7522478 |

| shs_perf |
|------------|
| -1 |
| -1 |
| -1 |
| -1 |
| -1 |

| shs_work |
|------------|
| 0 |
| 0 |
| 0 |
| 0 |
| 0 |
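
A rough user-side sketch of the requested --transpose behavior, written against the Python API with tabulate (which the sqlite-utils CLI already uses for its table output); the database and table names are taken from the example above:

```python
from sqlite_utils import Database
from tabulate import tabulate

db = Database("track_metadata.db")
rows = list(db["songs"].rows_where(limit=5))

# Print one narrow table per column, similar to psql's \x extended display.
for column in rows[0].keys():
    print(tabulate([[row[column]] for row in rows], headers=[column], tablefmt="github"))
    print()
```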

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/535/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1581090327 I_kwDOCGYnMM5ePYYX 529 Microsoft line endings chapmanjacobd 7908073 closed 0     1 2023-02-12T02:20:48Z 2023-06-14T23:12:12Z 2023-06-14T23:11:47Z CONTRIBUTOR  

sqlite-utils prints \r\n but it should probably print \n (unless the platform is detected as Windows?)

It has tripped me up a few times when piping the output of sqlite-utils to other programs:

$ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | cat -A
/mnt/d7/file^M$
$ sqlite-utils --no-headers --csv ~/lb/fs/d.db 'select path from media limit 1' | tr -d '\r' | cat -A
/mnt/d7/file$
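
The \r\n most likely comes from Python's csv module, whose default dialect uses a \r\n line terminator (an inference about the cause, not something stated in the issue). A minimal demonstration:

```python
import csv
import io

# The default "excel" dialect terminates rows with \r\n ...
buf = io.StringIO()
csv.writer(buf).writerow(["/mnt/d7/file"])
print(repr(buf.getvalue()))  # '/mnt/d7/file\r\n'

# ... while lineterminator="\n" produces Unix-style output.
buf = io.StringIO()
csv.writer(buf, lineterminator="\n").writerow(["/mnt/d7/file"])
print(repr(buf.getvalue()))  # '/mnt/d7/file\n'
```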

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/529/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1740150327 I_kwDOCGYnMM5nuJY3 557 Aliased ROWID option for tables created from alter=True commands chapmanjacobd 7908073 closed 0     2 2023-06-04T05:29:28Z 2023-06-14T06:09:21Z 2023-06-05T19:26:26Z CONTRIBUTOR  

If you use an INTEGER PRIMARY KEY column, VACUUM does not change the values of that column. However, if you use an unaliased rowid, the VACUUM command will reset the rowid values.

ROWID should never be used with foreign keys, but the simple act of aliasing rowid to id (which is what happens when one uses id integer primary key DDL) makes it OK.

It would be convenient if there were more options to use a string column (e.g. filepath) as the PK and be able to use it during upserts, but, when creating a foreign key, to create an integer column which aliases rowid.

I made an attempt to switch to integer primary keys here but it is not going well... In my use case the path column is a business key. Yes, it should be as simple as including the id column in any select statement where I plan on using upsert, but it would be nice if this could be abstracted away somehow: https://github.com/chapmanjacobd/library/commit/788cd125be01d76f0fe2153335d9f6b21db1343c

https://github.com/chapmanjacobd/library/actions/runs/5173602136/jobs/9319024777
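
For illustration, one way to keep a stable aliased-rowid id while upserting by a business key is SQLite's ON CONFLICT ... DO UPDATE. This is a plain-sqlite3 sketch of the behavior being asked for, not a proposed sqlite-utils API; the table and column names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE media (id INTEGER PRIMARY KEY, path TEXT UNIQUE NOT NULL, size INTEGER)"
)

# Upserting on the business key keeps the aliased-rowid id stable,
# unlike INSERT OR REPLACE, which deletes the row and assigns a new rowid.
sql = (
    "INSERT INTO media (path, size) VALUES (?, ?) "
    "ON CONFLICT(path) DO UPDATE SET size = excluded.size"
)
conn.execute(sql, ("/mnt/d7/file", 1))
conn.execute(sql, ("/mnt/d7/file", 2))
print(conn.execute("SELECT id, path, size FROM media").fetchall())  # [(1, '/mnt/d7/file', 2)]
```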

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/557/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1393202060 I_kwDOCGYnMM5TCpOM 496 devrel/python api: Pylance type hinting chapmanjacobd 7908073 open 0     4 2022-10-01T03:03:34Z 2023-05-03T05:53:27Z   CONTRIBUTOR  

Pylance is generally pretty good at figuring out stuff but sqlite-utils has some quirks which make type hinting kinda useless. Maybe you don't care but I thought I would bring it to your attention.

For example:

db["subs"].insert_all(subs, pk="index")

Cannot access member "insert_all" for type "View" Member "insert_all" is unknown

insert_all and all the other methods show up as type issues because the program can't know whether something is a View or a Table. Fair enough. But that basically throws all type checking out the window.

pk="index" also shows up as a type issue:

Argument of type "Literal['index']" cannot be assigned to parameter "pk" of type "Default" in function "insert_all" "Literal['index']" is incompatible with "Default"

I think this is because DEFAULT is an empty class?

Maybe a few small changes could be made to make the library more type-friendly.

The interim solution is of course to turn off type checking completely for the line: db["subs"].insert_all(subs, pk="index")  # type: ignore
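
Another interim workaround that keeps type checking on is to narrow the Union[Table, View] returned by db[...] with typing.cast. A small sketch against an in-memory database, assuming current sqlite-utils:

```python
from typing import cast

from sqlite_utils import Database
from sqlite_utils.db import Table

db = Database(memory=True)
subs = [{"index": 1, "name": "example"}]

# db["subs"] is typed as Union[Table, View]; cast() tells Pylance which one we mean,
# so insert_all and its pk= argument type-check without a blanket "type: ignore".
subs_table = cast(Table, db["subs"])
subs_table.insert_all(subs, pk="index")
print(subs_table.count)  # 1
```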

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/496/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   
1436539554 I_kwDOCGYnMM5Vn9qi 511 [insert_all, upsert_all] IntegrityError: constraint failed chapmanjacobd 7908073 closed 0     2 2022-11-04T19:21:48Z 2022-11-04T22:59:54Z 2022-11-04T22:54:09Z CONTRIBUTOR  

My understanding is that INSERT OR IGNORE will ignore inserts that would cause duplicate keys, so I'm not sure exactly why the error is raised by sqlite3.

```
import argparse
from pathlib import Path

from xklb import db, utils
from xklb.utils import log


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser()
    parser.add_argument("database")
    parser.add_argument("dbs", nargs="*")
    parser.add_argument("--upsert")
    parser.add_argument("--db", "-db", help=argparse.SUPPRESS)
    parser.add_argument("--verbose", "-v", action="count", default=0)
    args = parser.parse_args()

    if args.db:
        args.database = args.db
    Path(args.database).touch()
    args.db = db.connect(args)
    log.info(utils.dict_filter_bool(args.__dict__))

    return args


def merge_db(args, source_db):
    source_db = str(Path(source_db).resolve())

    s_db = db.connect(argparse.Namespace(database=source_db, verbose=args.verbose))
    for table in [s for s in s_db.table_names() if "_fts" not in s and not s.startswith("sqlite_")]:
        log.info("[%s]: %s", source_db, table)
        with s_db.conn:
            data = s_db[table].rows

        with args.db.conn:
            if args.upsert:
                args.db[table].upsert_all(data, pk=args.upsert.split(","), alter=True)
            else:
                args.db[table].insert_all(data, alter=True, replace=True)


def merge_dbs():
    args = parse_args()
    for s_db in args.dbs:
        merge_db(args, s_db)


if __name__ == "__main__":
    merge_dbs()
```

```
$ lb-dev merge video.db tube_71.db --upsert path -vv
SQL: INSERT OR IGNORE INTO media VALUES(?); - params: ['https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz']
...
File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:3122, in Table.insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, hash_id_columns, alter, ignore, replace, truncate, extracts, conversions, columns, upsert, analyze)
   3116 all_columns += [
   3117     column for column in record if column not in all_columns
   3118 ]
   3120 first = False
-> 3122 self.insert_chunk(
   3123     alter,
   3124     extracts,
   3125     chunk,
   3126     all_columns,
   3127     hash_id,
   3128     hash_id_columns,
   3129     upsert,
   3130     pk,
   3131     conversions,
   3132     num_records_processed,
   3133     replace,
   3134     ignore,
   3135 )
   3137 if analyze:
   3138     self.analyze()

File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:2887, in Table.insert_chunk(self, alter, extracts, chunk, all_columns, hash_id, hash_id_columns, upsert, pk, conversions, num_records_processed, replace, ignore)
   2885 for query, params in queries_and_params:
   2886     try:
-> 2887         result = self.db.execute(query, params)
   2888     except OperationalError as e:
   2889         if alter and (" column" in e.args[0]):
   2890             # Attempt to add any missing columns, then try again

File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:484, in Database.execute(self, sql, parameters)
    482     self._tracer(sql, parameters)
    483 if parameters is not None:
--> 484     return self.conn.execute(sql, parameters)
    485 else:
    486     return self.conn.execute(sql)

IntegrityError: constraint failed

/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py(484)execute()
    482     self._tracer(sql, parameters)
    483 if parameters is not None:
--> 484     return self.conn.execute(sql, parameters)
    485 else:
    486     return self.conn.execute(sql)
```

$ sqlite3 --version
3.36.0 2021-06-18 18:36:39
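
For what it's worth, OR IGNORE only suppresses UNIQUE, NOT NULL, CHECK, and PRIMARY KEY conflicts; it behaves like ABORT for foreign key violations, so an IntegrityError can still surface even with INSERT OR IGNORE. That is not necessarily the cause here; this is just a reminder of the limits of OR IGNORE, with made-up table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE playlists (id INTEGER PRIMARY KEY)")
conn.execute(
    "CREATE TABLE media (path TEXT PRIMARY KEY, playlist_id INTEGER REFERENCES playlists(id))"
)

# A duplicate primary key is silently skipped by OR IGNORE:
conn.execute("INSERT INTO media VALUES ('/a', NULL)")
conn.execute("INSERT OR IGNORE INTO media VALUES ('/a', NULL)")  # no error

# ...but OR IGNORE acts like ABORT for foreign key violations:
try:
    conn.execute("INSERT OR IGNORE INTO media VALUES ('/b', 999)")
except sqlite3.IntegrityError as e:
    print(e)  # FOREIGN KEY constraint failed
```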

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/511/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1430325103 I_kwDOCGYnMM5VQQdv 507 conn.execute: UnicodeEncodeError: 'utf-8' codec can't encode character chapmanjacobd 7908073 closed 0     1 2022-10-31T18:49:51Z 2022-11-01T00:40:17Z 2022-11-01T00:40:16Z CONTRIBUTOR  

I'm not really sure what caused this and it happened in the middle of my program (after running for 35775 seconds).

Extracting metadata 49.9% (chunk 9893 of 19831)
...
  File "/home/xk/.local/lib/python3.10/site-packages/xklb/fs_extract.py", line 90, in extract_chunk
    args.db["media"].insert_all(utils.list_dict_filter_bool(media), pk="path", alter=True, replace=True)
  File "/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py", line 3107, in insert_all
    self.insert_chunk(
  File "/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py", line 2872, in insert_chunk
    result = self.db.execute(query, params)
  File "/home/xk/.local/lib/python3.10/site-packages/sqlite_utils/db.py", line 483, in execute
    return self.conn.execute(sql, parameters)
UnicodeEncodeError: 'utf-8' codec can't encode character '\udcc3' in position 62: surrogates not allowed

This might be relevant: https://stackoverflow.com/questions/31898353/python-cant-encode-with-surrogateescape

I'm going to try re-running with

```py
def execute(
    self, sql: str, parameters: Optional[Union[Iterable, dict]] = None
) -> sqlite3.Cursor:
    """
    Execute SQL query and return a ``sqlite3.Cursor``.

    :param sql: SQL query to execute
    :param parameters: Parameters to use in that query - an iterable for ``where id = ?``
      parameters, or a dictionary for ``where id = :id``
    """
    try:
        if self._tracer:
            self._tracer(sql, parameters)
        if parameters is not None:
            return self.conn.execute(sql, parameters)
        else:
            return self.conn.execute(sql)
    except UnicodeEncodeError:
        sql = sql.encode('utf-8', 'surrogatepass').decode('utf-8')
        if parameters is not None:
            parameters = parameters.encode('utf-8', 'surrogatepass').decode('utf-8')
        return self.execute(sql, parameters)
```
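
A caller-side alternative (the helper name here is hypothetical, not part of sqlite-utils) is to scrub lone surrogates out of string values before they reach insert_all:

```python
def scrub_surrogates(value):
    # Lone surrogates (e.g. from os.fsdecode() on non-UTF-8 filenames) cannot be
    # stored as UTF-8; replace them so sqlite3 accepts the string.
    if isinstance(value, str):
        return value.encode("utf-8", "replace").decode("utf-8")
    return value


row = {"path": "/mnt/d7/bad\udcc3name", "size": 123}
clean = {key: scrub_surrogates(val) for key, val in row.items()}
print(clean)  # {'path': '/mnt/d7/bad?name', 'size': 123}
```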

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/507/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1393212964 I_kwDOCGYnMM5TCr4k 497 column_names chapmanjacobd 7908073 closed 0     1 2022-10-01T03:34:21Z 2022-10-25T21:09:28Z 2022-10-25T21:09:28Z CONTRIBUTOR  

It would be nice to have a column_names method, similar to table_names.

Or if you could get one or all of the following syntax to work for both Database and Table that might be even better:

Style 1
- if 'table1' in db
- if 'col1' in db['table1']

Style 2
- if 'table1' in db.tables
- if 'col1' in db['table1'].columns

Maybe the table ones actually work but I'm too lazy to check. I just know that I have to do:

[c.name for c in db['table1'].columns]

Edit: This is possible with columns_dict. I have actually used that before but I forgot about it. Feel free to close, but I do think accessing this data could be more consistent and intuitive.
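
For reference, the columns_dict workaround mentioned above looks like this (a small sketch against an in-memory database):

```python
from sqlite_utils import Database

db = Database(memory=True)
db["table1"].insert({"col1": 1, "col2": "a"})

print("table1" in db.table_names())         # True
print(db["table1"].columns_dict)            # {'col1': <class 'int'>, 'col2': <class 'str'>}
print("col1" in db["table1"].columns_dict)  # True
```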

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/497/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1361355564 I_kwDOCGYnMM5RJKMs 482 balanced table default column_order chapmanjacobd 7908073 closed 0     1 2022-09-05T03:00:18Z 2022-10-10T17:43:02Z 2022-09-06T20:17:27Z CONTRIBUTOR  

Is there any performance or size difference with column order in SQLite? Similar to this: https://www.cybertec-postgresql.com/en/column-order-in-postgresql-does-matter/

It might be interesting to have an option to create tables with an optimized column order. I'm assuming this would look something like INTEGER columns, REAL columns, BLOB columns, TEXT columns, NULL columns. NULL columns go at the end because they are more likely to be TEXT and it is impossible to know if they will become INTEGER.

(Of course, any schema evolution would reduce optimization but maybe column order could also be re-evaluated when schema changes)

Edit: this is easy to accomplish with the existing transform method, as sketched below:

int_columns = [k for k, v in table_columns.items() if v == int]
db[table].transform(column_order=[*int_columns])
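
A fuller sketch of that approach, using columns_dict and transform from the current API; the ordering rule here (INTEGER first, then REAL, then everything else) is my own assumption:

```python
from sqlite_utils import Database

db = Database(memory=True)
db["songs"].insert({"title": "x", "duration": 1.5, "year": 2003, "plays": 7})

cols = db["songs"].columns_dict  # column name -> Python type
order = sorted(cols, key=lambda c: (cols[c] is not int, cols[c] is not float))
db["songs"].transform(column_order=order)

print([c.name for c in db["songs"].columns])  # ['year', 'plays', 'duration', 'title']
```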

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/482/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed
1355193529 I_kwDOCGYnMM5Qxpy5 479 OperationalError: cannot VACUUM from within a transaction chapmanjacobd 7908073 open 0     0 2022-08-30T05:34:24Z 2022-08-30T05:34:24Z   CONTRIBUTOR  

Maybe when calling .vacuum() and other DB-level write-lock operations sqlite_utils could guard against this error message by automatically committing first?

```
     46 db["media"].optimize()  # type: ignore
---> 47 db.vacuum()

File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:1047, in Database.vacuum(self)
   1045 def vacuum(self):
   1046     "Run a SQLite VACUUM against the database."
-> 1047     self.execute("VACUUM;")

File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:470, in Database.execute(self, sql, parameters)
    468     return self.conn.execute(sql, parameters)
    469 else:
--> 470     return self.conn.execute(sql)

OperationalError: cannot VACUUM from within a transaction
```

It might also be nice to add a sentence or two to the docs about how transactions are committed. When I was swapping out my sqlite3 code for this library it was nice that everything was pretty much drop-in, but I was/am unsure what to do about the places where I explicitly call .commit() in my code.

Related to https://github.com/simonw/sqlite-utils/issues/121
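
A caller-side guard for the meantime (an assumption about usage, not a documented pattern): make sure no transaction is open on the underlying connection before calling .vacuum():

```python
from sqlite_utils import Database

db = Database(memory=True)
db["media"].insert({"path": "/mnt/d7/file"})

# VACUUM must run outside a transaction, so commit anything still open first.
if db.conn.in_transaction:
    db.conn.commit()
db.vacuum()
```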

sqlite-utils 140912432 issue    
{
    "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/479/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
   

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
, [active_lock_reason] TEXT, [performed_via_github_app] TEXT, [reactions] TEXT, [draft] INTEGER, [state_reason] TEXT);
CREATE INDEX [idx_issues_repo]
                ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
                ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
                ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
                ON [issues] ([user]);