HOW BIGQUERY HAS HELPED CONDUIT
Here at Conduit, we faced a challenge similar to the one Google faced when it set out on its journey to create BigQuery.
We hit a point where we needed our reports to be smoother and faster, and we had to find a solution that could handle all the data from our 1,500 monthly clients.
This was no easy feat: we needed a solution that was simple, reliable, and affordable, and that was not limited in its scaling or speed capabilities.
After months of researching, we narrowed our search down to BigQuery.
For us, BigQuery offered a time-to-value ratio no other application could match, let alone our previous process.
At one point, we were pulling our client data into reports via Google Sheets and Excel. This did not just pose a scaling problem once we hit a certain volume of data; it was also as slow as molasses.
BigQuery solved that issue for us, with virtually limitless scaling and the ability to scan terabytes of data in seconds and petabytes in minutes!
Simplicity was another factor we valued heavily. We needed something we could get a grasp on and manage in-house, which would help us control costs and work with our data the way we wanted.
BigQuery provides us with the infrastructure not only to store our data but to transform it with queries, letting us connect our data sources in as lightweight a way as possible and helping us control costs in the long run.
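To give a rough idea of what that looks like in practice, a lightweight transformation in BigQuery is just a SQL query over the raw tables. The dataset, table, and column names below are hypothetical, not our actual schema.

    -- Sketch: roll raw campaign events up into a per-client daily summary.
    -- `analytics.raw_events` and its columns are hypothetical names.
    SELECT
      client_id,
      DATE(event_timestamp) AS report_date,
      COUNT(*) AS total_events,
      SUM(spend_usd) AS total_spend
    FROM analytics.raw_events
    GROUP BY client_id, report_date
    ORDER BY report_date, client_id;

Because the transformation lives in the query rather than in a separate pipeline, there is no extra infrastructure to run or pay for between the raw data and the report.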
Touching back on our need for scalability: with 1,500 monthly clients, most of them running multiple campaigns with us, we needed an application that could scale with our tremendous growth.
BigQuery was designed with that in mind; it has virtually no limits on scalability, and combined with its low costs it offers a great price point for analytics data.
We were not just looking for a solution that was simple and massively scalable; it also needed to be blazing fast to give us real-time reporting.
BigQuery’s serverless infrastructure provides the speed we need to meet all of our reporting requirements.
The infrastructure behind BigQuery would be unreachable for most companies. Access to that kind of computing power is not common, and building it in-house would cost an unfathomable amount of money (a bit of an exaggeration, but you get the point).
With BigQuery’s ability to run a 4.06 TB query in roughly 24 seconds, our search for a solution with the speed we required ended there.
Reliability also played a massive role in our search. A dependable managed infrastructure that eliminated the need to maintain our own was top of mind, and BigQuery provided exactly that.
According to Google, this is how they achieve that reliability: “BigQuery has a geographically diverse team of Site Reliability Engineers (SREs) who monitor the service 24/7 for outages, performance degradation, latency, and failures.
SREs track the service against internal SLOs, which are often much stricter than public SLAs. We are also able to help customers research not-so-obvious SQL issues.
The BigQuery team works behind the scenes to help ensure that you get the most current software stack running on fantastic infrastructure.
To that end, we may seamlessly migrate your queries to a different data center (while of course respecting the dataset location constraints you’ve set, e.g., if you’ve asked that it remains in Europe).
This means that your BigQuery queries may run in one data center in your region in the morning, and in another data center in the afternoon, as we roll out a new version of Dremel, upgrade networking or hardware or implement a new compression algorithm.”
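As a concrete aside on the location constraint Google mentions: a dataset’s region is fixed once at creation time, so a setting like the sketch below (with a hypothetical dataset name) is what keeps data pinned to Europe.

    -- Sketch: create a dataset pinned to the EU multi-region, so its
    -- storage and queries never leave Europe. `client_reports` is a
    -- hypothetical dataset name.
    CREATE SCHEMA client_reports
    OPTIONS (
      location = 'EU',
      description = 'Per-client campaign reporting data'
    );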
Simplicity, speed, and reliability were not all we were looking for; we needed security above all.
With over 1,500 clients and all the data that comes with that volume, we needed to make sure our clients were protected.
BigQuery makes this possible by encrypting all data at rest and in transit by default, and by controlling access to projects and datasets through Google Cloud’s Identity and Access Management (IAM).
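As a rough sketch of what that access control looks like, BigQuery lets you grant IAM roles on a dataset directly in SQL. The dataset name and email address below are hypothetical.

    -- Sketch: give one analyst read-only access to a single dataset
    -- via an IAM role. Dataset name and email are hypothetical.
    GRANT `roles/bigquery.dataViewer`
    ON SCHEMA client_reports
    TO "user:analyst@example.com";

Scoping access this way means each person sees only the datasets they need, rather than everything in the project.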