We took a pragmatic approach. BigQuery offers 1 TB of query processing free per month and per project. We have about 5 developers per project involved in data manipulation, so we set a limit of 10 GB per day and per user.
20 working days per month × 5 developers × 10 GB = 1 TB.
In addition, we estimate that in the Dev and Test environments, where datasets are limited, partitioned, and clustered, developers shouldn't need to query more than 10 GB per day.
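To help stay under that 10 GB budget, a developer can check a query's footprint before running it. Here is a minimal sketch using the google-cloud-bigquery Python client: a dry run returns the bytes a query would scan (at no cost), and `maximum_bytes_billed` makes BigQuery reject any query that would bill more than a cap. The table name is a placeholder.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes default project and credentials

# Hypothetical table name, for illustration only.
sql = "SELECT field FROM `my-project.my_dataset.my_table` WHERE dt = '2021-01-01'"

# Dry run: BigQuery estimates the bytes the query would scan, free of charge.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(sql, job_config=job_config)
print(f"Query would process {job.total_bytes_processed / 10**9:.2f} GB")

# Guardrail: the query fails upfront if it would bill more than 10 GB.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 10**9)
result = client.query(sql, job_config=job_config).result()
```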
Of course, some projects, like the datalake project, don't have this limit (because we also have slots, and therefore flat-rate pricing).
For production, we also set a global limit on the project (not per user). We take the maximum amount of data processed in a single day over the past month and set the quota 1.5 times higher, and we review it every month.
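As a sketch of that monthly review step, the busiest day can be computed from BigQuery's job-history view, INFORMATION_SCHEMA.JOBS_BY_PROJECT, and the suggested quota follows from it. The `region-us` qualifier is an assumption here; use the region your jobs actually run in. (These per-project and per-user limits correspond to BigQuery's "Query usage per day" and "Query usage per user per day" custom quotas, which can be set from the Cloud Console quotas page.)

```python
from google.cloud import bigquery

client = bigquery.Client()

# Find the day with the most bytes processed over the last 30 days.
sql = """
SELECT
  DATE(creation_time) AS day,
  SUM(total_bytes_processed) AS bytes_processed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND job_type = 'QUERY'
GROUP BY day
ORDER BY bytes_processed DESC
LIMIT 1
"""

row = next(iter(client.query(sql).result()))
max_gb = row.bytes_processed / 10**9
print(f"Busiest day: {row.day}, {max_gb:.1f} GB processed")
print(f"Suggested project quota (1.5x): {max_gb * 1.5:.1f} GB per day")
```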
These are our rules, based on our own judgment, our risk management, and our context. Feel free to take inspiration from them, but adapt them to your own context and requirements (or budget!!).