issues
27 rows where repo = 256834907, type = "issue" and user = 9599 sorted by updated_at descending
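In SQL terms, the listing above corresponds to a query along these lines. This is a sketch using Python's sqlite3 module; the database filename github.db is an assumption, but the table and column names come from the schema shown at the bottom of this page.

```
import sqlite3

# "github.db" is illustrative; point this at your github-to-sqlite database file.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT id, number, title, state, comments, created_at, updated_at, closed_at
    FROM issues
    WHERE repo = 256834907 AND type = 'issue' AND "user" = 9599
    ORDER BY updated_at DESC
    """
).fetchall()
for row in rows:
    print(row)
```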
Every issue below shares the same values for several columns: user simonw (9599), author_association MEMBER, repo dogsheep-photos (256834907), type issue, locked 0, and no assignee, pull_request, active_lock_reason, performed_via_github_app or draft value. Each record therefore lists the remaining columns in this order: id | node_id | number and title | state (with state_reason for closed issues) | comment count | milestone (where one is set) | created_at | updated_at | closed_at (for closed issues) | reaction totals. The issue body follows on its own line where this export captured one.
602533481 | MDU6SXNzdWU2MDI1MzM0ODE= | #3 Import EXIF data into SQLite - lens used, ISO, aperture etc | open | 2 comments | milestone: Apple Photos online and securely browsable (5324096) | created 2020-04-18T19:24:31Z | updated 2021-10-05T12:38:24Z | reactions: 0
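No body was captured for #3 in this export. As a sketch of the idea in its title, here is one way to read lens, ISO and aperture style tags with Pillow; the library choice and the tag names printed at the end are illustrative, not necessarily what the project implemented.

```
from PIL import ExifTags, Image

def exif_dict(path):
    # Tags such as FNumber and ISOSpeedRatings live in the Exif sub-IFD (0x8769),
    # so merge it with the base IFD before mapping tag ids to readable names.
    exif = Image.open(path).getexif()
    merged = {**dict(exif), **dict(exif.get_ifd(0x8769))}
    return {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in merged.items()}

tags = exif_dict("IMG_0001.jpg")  # example filename
print(tags.get("Model"), tags.get("ISOSpeedRatings"), tags.get("FNumber"))
```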
602585497 | MDU6SXNzdWU2MDI1ODU0OTc= | #7 Integrate image content hashing | open | 2 comments | created 2020-04-19T00:36:58Z | updated 2021-08-26T02:01:01Z | reactions: 1 (heart 1)
Body: To spot duplicate images (where the file content differs such that the sha256 is no longer a match) it would be useful to calculate and store perceptual hashes of some sort.
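One common way to do this is a perceptual hash such as pHash. A minimal sketch using the third-party imagehash package; whether dogsheep-photos adopted this particular library is not stated in the issue.

```
from PIL import Image
import imagehash

def perceptual_hash(path):
    # Unlike a sha256 of the bytes, a perceptual hash stays stable under
    # re-encoding and resizing, so near-duplicates hash to nearby values.
    return imagehash.phash(Image.open(path))

# Subtracting two hashes gives the Hamming distance; a small distance
# usually means the images are visually the same.
print(perceptual_hash("a.jpg") - perceptual_hash("b.jpg"))
```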
612151767 | MDU6SXNzdWU2MTIxNTE3Njc= | #15 Expose scores from ZCOMPUTEDASSETATTRIBUTES | closed (completed) | 7 comments | created 2020-05-04T20:36:07Z | updated 2020-12-20T04:44:22Z | closed 2020-05-05T00:11:45Z | reactions: 0
Body (truncated in this export): The Apple Photos database has a
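The rest of the body is cut off above. For context, a query along these lines reads scores out of that table in the Photos library database; the Photos.sqlite path and the ZOVERALLAESTHETICSCORE column name are from memory and vary between Photos versions, so treat them as assumptions.

```
import sqlite3

# Path and column names are assumptions; Apple's Core Data schema changes between macOS releases.
db = sqlite3.connect("Photos Library.photoslibrary/database/Photos.sqlite")
for row in db.execute(
    "SELECT Z_PK, ZOVERALLAESTHETICSCORE FROM ZCOMPUTEDASSETATTRIBUTES LIMIT 5"
):
    print(row)
```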
615474990 | MDU6SXNzdWU2MTU0NzQ5OTA= | #21 bpylist.archiver.CircularReference: archive has a cycle with uid(13) | closed (completed) | 11 comments | created 2020-05-10T20:58:06Z | updated 2020-12-19T07:44:49Z | closed 2020-05-10T21:57:13Z | reactions: 0
Body:
```
% python -i $(which photos-to-sqlite) apple-photos photos.db

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/bin/photos-to-sqlite", line 11, in <module>
load_entry_point('photos-to-sqlite', 'console_scripts', 'photos-to-sqlite')()
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py", line 829, in call
return self.main(args, kwargs)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/cli.py", line 249, in apple_photos
photo_row = osxphoto_to_row(sha256, photo)
File "/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/utils.py", line 91, in osxphoto_to_row
place = photo.place
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py", line 614, in place
self._place = PlaceInfo5(self._info["reverse_geolocation"])
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py", line 505, in init
self._plrevgeoloc = archiver.unarchive(revgeoloc_bplist)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 16, in unarchive
return Unarchive(plist).top_object()
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 256, in top_object
return self.decode_object(self.top_uid)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 247, in decode_object
obj = klass.decode_archive(ArchivedObject(raw_obj, self))
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py", line 126, in decode_archive
mapItem = archive.decode("mapItem")
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 140, in decode
return self._unarchiver.decode_key(self._object, key)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 216, in decode_key
return self.decode_object(val)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 247, in decode_object
obj = klass.decode_archive(ArchivedObject(raw_obj, self))
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py", line 180, in decode_archive
sortedPlaceInfos = archive.decode("sortedPlaceInfos")
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 140, in decode
return self._unarchiver.decode_key(self._object, key)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 216, in decode_key
return self.decode_object(val)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 247, in decode_object
obj = klass.decode_archive(ArchivedObject(raw_obj, self))
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 112, in decode_archive
return [archive._decode_index(index) for index in uids]
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 112, in <listcomp>
return [archive._decode_index(index) for index in uids]
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 137, in _decode_index
return self._unarchiver.decode_object(index)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 247, in decode_object
obj = klass.decode_archive(ArchivedObject(raw_obj, self))
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py", line 217, in decode_archive
placeType = archive.decode("placeType")
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 140, in decode
return self._unarchiver.decode_key(self._object, key)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 216, in decode_key
return self.decode_object(val)
File "/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py", line 227, in decode_object
raise CircularReference(index)
bpylist.archiver.CircularReference: archive has a cycle with uid(13)
```
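The crash happens while osxphotos decodes the photo's reverse-geolocation plist via photo.place. As a purely illustrative workaround (not the fix that actually landed), the place lookup can be treated as optional so one bad record does not abort the whole import.

```
def safe_place(photo):
    # Sketch only: swallow place-decoding failures and record no place instead.
    try:
        return photo.place  # osxphotos PhotoInfo.place, which triggered the traceback above
    except Exception:
        return None
```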
613006393 | MDU6SXNzdWU2MTMwMDYzOTM= | #20 Ability to serve thumbnailed Apple Photo from its place on disk | closed (completed) | 10 comments | created 2020-05-06T02:17:50Z | updated 2020-05-25T20:14:22Z | closed 2020-05-25T20:09:41Z | reactions: 0
Body: A custom Datasette plugin that can be run locally on a Mac laptop which knows how to serve photos such that they can be seen in the browser. Originally posted by @simonw in https://github.com/dogsheep/photos-to-sqlite/issues/19#issuecomment-624406285
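A minimal sketch of the general shape, using Datasette's register_routes plugin hook plus Pillow for the resizing. The URL pattern, query parameter and thumbnail size are illustrative, not the plugin that was actually built.

```
import io

from datasette import hookimpl
from datasette.utils.asgi import Response
from PIL import Image

async def thumbnail(request):
    # Illustrative only: a real plugin would restrict which paths may be served.
    path = request.args["path"]
    image = Image.open(path)
    image.thumbnail((400, 400))
    buffer = io.BytesIO()
    image.convert("RGB").save(buffer, "JPEG")
    return Response(buffer.getvalue(), content_type="image/jpeg")

@hookimpl
def register_routes():
    return [(r"^/-/thumbnail$", thumbnail)]
```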
621332242 | MDU6SXNzdWU2MjEzMzIyNDI= | #25 Create a public demo | closed (completed) | 5 comments | created 2020-05-19T22:47:20Z | updated 2020-05-21T22:26:16Z | closed 2020-05-20T05:54:18Z | reactions: 0
Body: So I can show people what this does, using some of my photos.
621486115 | MDU6SXNzdWU2MjE0ODYxMTU= | #27 photos_with_apple_metadata view should include labels | open | 0 comments | created 2020-05-20T06:06:17Z | updated 2020-05-20T06:06:17Z | reactions: 0
Body (truncated in this export): Here's one way to add that:
621323348 | MDU6SXNzdWU2MjEzMjMzNDg= | #24 Configurable URL for images | open | 1 comment | created 2020-05-19T22:25:56Z | updated 2020-05-20T06:00:29Z | reactions: 0
Body: This is hard-coded at the moment, which is bad: https://github.com/dogsheep/photos-to-sqlite/blob/d5d69b9019703c47bc251444838578dd752801e2/photos_to_sqlite/cli.py#L269-L272
621444763 | MDU6SXNzdWU2MjE0NDQ3NjM= | #26 Rename project to dogsheep-photos | closed (completed) | 8 comments | created 2020-05-20T04:12:34Z | updated 2020-05-20T04:31:02Z | closed 2020-05-20T04:30:40Z | reactions: 0
621280529 | MDU6SXNzdWU2MjEyODA1Mjk= | #23 create-subset command for creating a publishable subset of a photos database | closed (completed) | 1 comment | created 2020-05-19T20:58:20Z | updated 2020-05-19T22:32:48Z | closed 2020-05-19T22:32:37Z | reactions: 0
Body (truncated in this export): I want to share a subset of my photos, without sharing everything. Idea: [...] So the command takes a SQL query that returns sha256 hashes, then creates a new file called
613002220 | MDU6SXNzdWU2MTMwMDIyMjA= | #19 apple-photos command should work even if upload has not run | closed (completed) | 1 comment | created 2020-05-06T02:02:25Z | updated 2020-05-19T20:59:59Z | closed 2020-05-19T20:59:59Z | reactions: 0
Body (truncated in this export): I want people to be able to query their Apple Photos metadata without having to first run [...] To do this I can have
615626118 | MDU6SXNzdWU2MTU2MjYxMTg= | #22 Try out ExifReader | open | 4 comments | created 2020-05-11T06:32:13Z | updated 2020-05-14T05:59:53Z | reactions: 0
Body: https://pypi.org/project/ExifReader/ New fork that should be able to handle EXIF in HEIC files. Forked here: https://github.com/ianare/exif-py/issues/102#issuecomment-626376522 Refs #3
612860758 | MDU6SXNzdWU2MTI4NjA3NTg= | #18 Switch CI solution to GitHub Actions with a macOS runner | open | 1 comment | created 2020-05-05T20:03:50Z | updated 2020-05-05T23:49:18Z | reactions: 0
Body: Refs #17.
612860531 | MDU6SXNzdWU2MTI4NjA1MzE= | #17 Only install osxphotos if running on macOS | closed (completed) | 3 comments | created 2020-05-05T20:03:26Z | updated 2020-05-05T20:20:05Z | closed 2020-05-05T20:11:23Z | reactions: 0
Body (truncated in this export): The build is broken right now because you can't
612287234 | MDU6SXNzdWU2MTIyODcyMzQ= | #16 Import machine-learning detected labels (dog, llama etc) from Apple Photos | open | 13 comments | created 2020-05-05T02:45:43Z | updated 2020-05-05T05:38:16Z | reactions: 2 (laugh 1, hooray 1)
Body: Follow-on from #1. Apple Photos runs some very sophisticated machine learning on-device to figure out if photos are of dogs, llamas and so on. I really want to extract those labels out into my own database.
602533300 | MDU6SXNzdWU2MDI1MzMzMDA= | #1 Import photo metadata from Apple Photos into SQLite | open | 8 comments | milestone: Apple Photos online and securely browsable (5324096) | created 2020-04-18T19:23:26Z | updated 2020-05-04T02:41:40Z | reactions: 0
Body: Faces, albums, locations, that kind of thing.
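A sketch of the kind of extraction this implies, using the osxphotos library the project builds on and sqlite-utils for writing rows. The attributes and table name chosen here are examples, not the project's final schema.

```
import osxphotos
import sqlite_utils

# Walk the local Apple Photos library and write one row of metadata per photo.
photosdb = osxphotos.PhotosDB()
db = sqlite_utils.Database("photos.db")
db["apple_photos"].insert_all(
    (
        {
            "uuid": photo.uuid,
            "date": str(photo.date),
            "albums": ", ".join(photo.albums),
            "persons": ", ".join(photo.persons),
            "favorite": photo.favorite,
        }
        for photo in photosdb.photos()
    ),
    pk="uuid",
    replace=True,
)
```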
608512747 | MDU6SXNzdWU2MDg1MTI3NDc= | #14 Annotate photos using the Google Cloud Vision API | open | 5 comments | created 2020-04-28T18:09:03Z | updated 2020-04-28T18:19:06Z | reactions: 3 (+1 2, heart 1)
Body: It can detect faces, run OCR, do image labeling (it knows what a lemur is!) and do object localization where it identifies objects and returns bounding polygons for them.
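A sketch using the official google-cloud-vision client, showing label detection only; which annotation types the issue ultimately used is not stated here, and the filename is illustrative.

```
from google.cloud import vision

# Requires GOOGLE_APPLICATION_CREDENTIALS to point at a service-account key.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, round(label.score, 3))
```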
602533352 | MDU6SXNzdWU2MDI1MzMzNTI= | #2 Ability to convert HEIC images to JPEG | closed (completed) | 1 comment | milestone: Apple Photos online and securely browsable (5324096) | created 2020-04-18T19:23:43Z | updated 2020-04-28T16:47:21Z | closed 2020-04-28T16:47:21Z | reactions: 0
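No body was captured for #2. One way to do the conversion today is the pillow-heif package, which registers a HEIF decoder with Pillow; this is an illustration, not necessarily the approach the issue settled on, and the filenames are examples.

```
from PIL import Image
from pillow_heif import register_heif_opener

# Teach Pillow how to open .heic files, then re-save as JPEG.
register_heif_opener()
Image.open("IMG_0001.heic").convert("RGB").save("IMG_0001.jpeg", "JPEG", quality=85)
```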
607888367 | MDU6SXNzdWU2MDc4ODgzNjc= | #13 Also upload movie files | open | 2 comments | created 2020-04-27T22:11:25Z | updated 2020-04-28T00:39:45Z | reactions: 0
Body (truncated in this export): The [...] Need to cover movies taken by my phone and DSLR too.
606032950 | MDU6SXNzdWU2MDYwMzI5NTA= | #11 Try running S3 uploads in a thread pool | closed (completed) | 0 comments | created 2020-04-24T04:34:31Z | updated 2020-04-24T16:45:41Z | closed 2020-04-24T16:45:41Z | reactions: 0
Body: Since #10 provided such a speedup, can the same thing be done for the actual uploads? http://ls.pwd.io/2013/06/parallel-s3-uploads-using-boto-and-threads-in-python/ suggests it can really help performance.
606033104 | MDU6SXNzdWU2MDYwMzMxMDQ= | #12 If less than 500MB, show size in MB not GB | open | 1 comment | created 2020-04-24T04:35:01Z | updated 2020-04-24T04:35:25Z | reactions: 0
Body (truncated in this export): Just saw this:
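The pasted output is cut off above. The requested behaviour is just a threshold in the size formatter; a sketch (the exact rounding is a guess, not the project's formatter):

```
def format_size(num_bytes):
    # Below 500 MB report megabytes, otherwise gigabytes.
    megabytes = num_bytes / (1024 * 1024)
    if megabytes < 500:
        return "{:.1f} MB".format(megabytes)
    return "{:.2f} GB".format(megabytes / 1024)

print(format_size(120 * 1024 * 1024))  # 120.0 MB
print(format_size(3 * 1024 ** 3))      # 3.00 GB
```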
606028272 | MDU6SXNzdWU2MDYwMjgyNzI= | #10 Speed up hashing step using threads | closed (completed) | 0 comments | created 2020-04-24T04:20:08Z | updated 2020-04-24T04:32:35Z | closed 2020-04-24T04:32:35Z | reactions: 0
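No body was captured for #10; the title describes hashing files in parallel. A sketch of that step with hashlib and a thread pool; the chunk size, worker count and filenames are arbitrary.

```
import hashlib
from concurrent.futures import ThreadPoolExecutor

def calculate_hash(path):
    # Stream the file in chunks so large originals do not have to fit in memory.
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            sha256.update(chunk)
    return path, sha256.hexdigest()

paths = ["IMG_0001.jpeg", "IMG_0002.jpeg"]  # illustrative
with ThreadPoolExecutor(max_workers=4) as executor:
    hashes = dict(executor.map(calculate_hash, paths))
```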
605938063 | MDU6SXNzdWU2MDU5MzgwNjM= | #9 upload command should be resumable, should only upload photos not already uploaded | closed (completed) | 2 comments | created 2020-04-23T23:31:08Z | updated 2020-04-23T23:39:14Z | closed 2020-04-23T23:39:14Z | reactions: 0
Body: Follow on from #4.
605147638 | MDU6SXNzdWU2MDUxNDc2Mzg= | #8 Should I have used MD5 instead of SHA256? | closed (completed) | 2 comments | created 2020-04-23T00:02:08Z | updated 2020-04-23T00:03:35Z | closed 2020-04-23T00:03:35Z | reactions: 0
Body (truncated in this export): https://docs.aws.amazon.com/AmazonS3/latest/API/RESTCommonResponseHeaders.html
602575575 | MDU6SXNzdWU2MDI1NzU1NzU= | #6 Add progress bar to upload command | closed (completed) | 2 comments | created 2020-04-18T23:32:41Z | updated 2020-04-19T00:15:24Z | closed 2020-04-19T00:15:24Z | reactions: 0
Body: Upload was added in #4
602533539 | MDU6SXNzdWU2MDI1MzM1Mzk= | #4 Upload all my photos to a secure S3 bucket | closed (completed) | 14 comments | milestone: Apple Photos online and securely browsable (5324096) | created 2020-04-18T19:24:50Z | updated 2020-04-18T21:58:11Z | closed 2020-04-18T21:57:13Z | reactions: 0
602551638 | MDU6SXNzdWU2MDI1NTE2Mzg= | #5 photos-to-sqlite s3-auth command | closed (completed) | 1 comment | created 2020-04-18T21:05:25Z | updated 2020-04-18T21:08:44Z | closed 2020-04-18T21:08:44Z | reactions: 0
Body (truncated in this export): Modeled on
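The body is cut off after "Modeled on". The general shape of such an auth command (prompt for credentials, save them to a JSON file) looks roughly like this; the option name and the auth.json layout are assumptions.

```
import json

import click

@click.command()
@click.option("--auth", default="auth.json", help="Path to save credentials to")
def s3_auth(auth):
    "Save S3 credentials to a JSON file"
    access_key = click.prompt("Access key ID")
    secret_key = click.prompt("Secret access key", hide_input=True)
    with open(auth, "w") as f:
        json.dump(
            {"access_key_id": access_key, "secret_access_key": secret_key}, f, indent=4
        )
```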
Table schema:
```
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
```