guillaume blaquiere
1 min read · Dec 2, 2020

We took a pragmatic approach. BigQuery offers 1 TB of query processing free per month and per project. We have about 5 developers per project involved in data manipulation, so we set a limit of 10 GB per day and per user.

20 working days per month × 5 developers × 10 GB = 1 TB.

In addition, we estimate that in the Dev and Test environments, the datasets are limited, partitioned, and clustered, so developers shouldn't need to query more than 10 GB per day.
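To help developers stay within that budget, a client-side guard can estimate a query's cost before running it and cap what a single query is allowed to bill. This is a minimal sketch using the google-cloud-bigquery Python client; the 10 GB figure, the GiB interpretation, and the sample query are assumptions, and such a guard complements (rather than replaces) the per-user quota set on the project:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses the project from your environment

# Assumption: our 10 GB per-developer daily budget, interpreted as GiB.
DAILY_BUDGET_BYTES = 10 * 1024**3

# Hypothetical query, for illustration only.
sql = "SELECT field FROM `my_project.my_dataset.my_table` WHERE day = '2020-12-02'"

# Dry run: BigQuery validates the query and reports the bytes it would
# process, without actually running it or billing anything.
dry_cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
estimate = client.query(sql, job_config=dry_cfg)
print(f"This query would process {estimate.total_bytes_processed / 1024**3:.2f} GiB")

# Hard cap: if the query would bill more than the budget, BigQuery
# fails the job instead of running it.
run_cfg = bigquery.QueryJobConfig(maximum_bytes_billed=DAILY_BUDGET_BYTES)
rows = client.query(sql, job_config=run_cfg).result()
```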

Of course, for some projects, like the data lake project, we don't set this limit (because we also have slots, and therefore flat-rate pricing).

For production, we also set a global limit on the project (not per user). We take the maximum amount of data processed per day over a month and set the quota 1.5 times higher. We review it every month.
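A monthly review like this can be scripted. The sketch below, again with the google-cloud-bigquery Python client, finds the busiest day of the last 30 days in INFORMATION_SCHEMA.JOBS_BY_PROJECT and derives a 1.5× quota; the `region-eu` qualifier and the 30-day window are assumptions to adapt to your setup:

```python
from google.cloud import bigquery

client = bigquery.Client()

# INFORMATION_SCHEMA is regional: replace `region-eu` with your location.
sql = """
SELECT
  DATE(creation_time) AS day,
  SUM(total_bytes_processed) AS bytes_processed
FROM `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY day
ORDER BY bytes_processed DESC
LIMIT 1
"""

peak = list(client.query(sql).result())[0]
quota_bytes = int(peak.bytes_processed * 1.5)  # the 1.5x safety margin
print(f"Peak day: {peak.day}, processed {peak.bytes_processed / 1024**4:.2f} TiB")
print(f"Suggested daily quota: {quota_bytes / 1024**4:.2f} TiB")
```

The resulting value can then be applied as the project-level "Query usage per day" custom quota in the Google Cloud console.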

That's our rule, based on our judgment, our risk management, and our context. Feel free to take inspiration from it, but adapt it to your own context and requirements (or budget!).
