issues: 807817197
column | value
---|---
id | 807817197
node_id | MDU6SXNzdWU4MDc4MTcxOTc=
number | 229
title | Hitting `_csv.Error: field larger than field limit (131072)`
user | 631242
state | closed
locked | 0
assignee |
milestone |
comments | 3
created_at | 2021-02-13T19:52:44Z
updated_at | 2021-02-14T21:33:33Z
closed_at | 2021-02-14T21:33:33Z
author_association | NONE
pull_request |
body | (full text below)
repo | 140912432
type | issue
active_lock_reason |
performed_via_github_app |
reactions | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/229/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
draft |
state_reason | completed

body:

> I have a CSV file where one of the fields is so large that it raises this exception and loading stops. The stack trace points here: https://github.com/simonw/sqlite-utils/blob/3.1/sqlite_utils/cli.py#L633 and there is a known workaround: https://stackoverflow.com/questions/15063936/csv-error-field-larger-than-field-limit-131072 (a sketch appears after this record). One difficulty with this problem was that sqlite-utils gives little context about where the offending line is: the progress bar reports percent complete rather than a line number, and a line number would have been helpful. It would also have been useful if loading could continue with the later lines.
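The workaround linked in the body raises the `csv` module's per-field size cap (the 131072 default named in the error) before parsing begins. Below is a minimal sketch of that Stack Overflow pattern, assuming plain standard-library usage rather than anything sqlite-utils exposes; the back-off loop handles platforms where `sys.maxsize` overflows the C long that stores the limit:

```python
import csv
import sys

# Raise the csv module's per-field size cap (default 131072).
# On some platforms sys.maxsize overflows the C long that backs the
# limit, so shrink the value until field_size_limit() accepts it.
limit = sys.maxsize
while True:
    try:
        csv.field_size_limit(limit)
        break
    except OverflowError:
        limit //= 10
```

On the line-number and continue-loading requests: `csv.reader` objects expose a `line_num` attribute, so a caller can at least report roughly where a bad row occurred and keep iterating. The wrapper below is only a sketch (`rows_with_line_numbers` is a hypothetical helper, not part of sqlite-utils); if the oversized field was quoted across several physical lines, the reader may resume misaligned:

```python
import csv

def rows_with_line_numbers(path):
    """Yield parsed rows, reporting the line number of rows that fail."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        while True:
            try:
                yield next(reader)
            except StopIteration:
                break
            except csv.Error as ex:
                # reader.line_num counts physical lines read so far.
                print(f"skipping a row near line {reader.line_num}: {ex}")
```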