issues: 1015646369

  • id: 1015646369
  • node_id: I_kwDOBm6k_c48iYih
  • number: 1480
  • title: Exceeding Cloud Run memory limits when deploying a 4.8G database
  • user: 110420
  • state: open
  • locked: 0
  • comments: 9
  • created_at: 2021-10-04T21:20:24Z
  • updated_at: 2022-10-07T04:39:10Z
  • author_association: CONTRIBUTOR

When I try to deploy a 4.8G SQLite database to Google Cloud Run, I get this error message:

Memory limit of 8192M exceeded with 8826M used. Consider increasing the memory limit, see https://cloud.google.com/run/docs/configuring/memory-limits

Unfortunately, the maximum amount of memory that can be allocated to an instance is 8192M.
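The limit in the error message is set per Cloud Run service. A hedged sketch of raising it with the gcloud CLI (the service name `my-datasette` is hypothetical, and 8Gi was the ceiling at the time of this issue):

```shell
# Raise the memory limit for an existing Cloud Run service.
# "my-datasette" is a hypothetical service name; 8Gi was the maximum
# Cloud Run allowed when this issue was filed, so the limit cannot be
# raised past the value already being exceeded here.
gcloud run services update my-datasette --memory 8Gi
```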

Naively profiling Datasette with this database on my MacBook (using Activity Monitor) shows the following memory usage just after startup:

  • Real Memory Size: 70.6 MB
  • Virtual Memory Size: 4.51 GB
  • Shared Memory Size: 2.5 MB
  • Private Memory Size: 57.4 MB
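The figures above can be reproduced without Activity Monitor. A minimal sketch using only Python's standard library to watch resident memory around a SQLite query (it uses an in-memory stand-in database so the snippet is self-contained; point `sqlite3.connect` at the real 4.8G file to profile the actual workload):

```python
import platform
import resource
import sqlite3


def rss_mb() -> float:
    """Peak resident set size of this process in MB.

    ru_maxrss is reported in kilobytes on Linux but in bytes on macOS.
    """
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return rss / (1024 * 1024) if platform.system() == "Darwin" else rss / 1024


# Stand-in database so the sketch runs anywhere; replace ":memory:" with
# the path to the 4.8G file to profile the real database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", ((i,) for i in range(100_000)))

before = rss_mb()
(count,) = conn.execute("SELECT count(*) FROM t").fetchone()
after = rss_mb()

print(f"rows={count}, RSS before={before:.1f} MB, after={after:.1f} MB")
```

SQLite reads pages from disk on demand, so even a full-table scan should grow resident memory only modestly; if resident memory climbs toward the size of the database file, something is pulling the whole file into memory rather than streaming it.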

I'm trying to understand whether there's a query or other operation that runs during container deployment and causes such large memory use, and whether it can be avoided somehow.

This is somewhat related to #1082, but on a different platform, so I decided to open a new issue.

  • repo: 107914493
  • type: issue

reactions:
{
    "url": "https://api.github.com/repos/simonw/datasette/issues/1480/reactions",
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}

Links from other tables

  • 2 rows from issues_id in issues_labels
  • 9 rows from issue in issue_comments
Powered by Datasette · Queries took 0.947ms · About: github-to-sqlite