Qarbon Technologies wants to set standards for data center sustainability management
How do organizations measure sustainability? Most rely on meeting the standards and requirements set by regulators, which are in turn typically built on measured, well-understood data and a methodology that has been standardized for everyone.
While many organizations can achieve this fairly quickly, it is not a simple process for data center operators. Data centers continue to mushroom worldwide to meet the demand for more computing power. But how is sustainability measured as the world builds more of them?
True enough, data centers today are built with sustainability at their core. This includes using sustainable materials to construct them and ensuring they run sustainably, with minimal carbon emissions.
But how is that performance measured? How do data centers prove that what they are doing is sustainable? Who do they compare themselves to? And who sets the bar?
Tech Wire Asia speaks to Robert Davidson, CEO and founder of Qarbon Technologies, to understand the importance of having standards for data center sustainability. Davidson also shares how data from data centers can do more than serve as a tool for sustainability management.
TWA: Why is there a need to set standards when it comes to measuring data center sustainability?
There will always be a balancing act between increasing demand and the desire to make data centers more sustainable, greener, and ESG-compliant. To make the right decisions about which data centers to use and which technologies to use, we need to have a way to access information about the data centers themselves.
Suppose you were to go to a given city that might have about 60 different data center operators, and you were to ask them: who is this city’s most energy-efficient, ESG-compliant data center operator? Sixty hands will go up. Every operator has a little spreadsheet they’ve done in-house that says they are the best. There is no way to measure evenly across them.
That’s one of the focuses of my company. The ESG product we’re focused on gathers the data from the different data center operators in a market, puts it into a standardized format, and then lets people decide how they view it.
Put simply, if organizations simply trust every data center operator’s claim to be ever better, the incentive is for operators to improve their metrics by changing the spreadsheet. There should be a situation where businesses can look at the data and measure it consistently.
Regulators can also understand who is green and who is moving the needle. It will allow regulators to open the industry up for things like green insurance and green loans, and for customers to make the right buying decisions.
We’re still going to be in a situation where a customer may say: do I go 80% green, 100% green, or 40% green? That will impact their bottom line. But at least they get to make that decision on accurate information. And as we get things moving in the right direction and improving, we’ll be able to reach a point where we’re even greener and more sustainable. But until we have an even way of measuring it, we don’t have solid ground to start from.
TWA: Are there current standards in place, and how can the data be measured?
Well, there are a number of different standards out there that are used in several industries, but none has yet been adopted for data centers. Let’s get the data into a flexible format so that you can apply all three or four of the different standards out there.
We’re also seeing this in the EU. Germany, for instance, has just passed laws around creating sustainability standards specifically for data centers. Their exact model will not be adopted globally, but it’s a start. And as soon as Germany does it, Singapore will look at it; then the US and many other countries will start looking at that methodology.
To me, right now, step one is to get the data and get it into a standard format. You can do side-by-side comparisons using the stare-and-compare method and then apply different calculations to that data to determine which is the better calculation. We still need to figure that out. But once you have it all in a single data lake, you can apply the different methods, and the correct answer will present itself.
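To make the "one data lake, many methods" idea concrete, here is a minimal sketch in Python. All the field and function names are hypothetical, not Qarbon’s actual schema; the metrics shown, PUE (power usage effectiveness) and CUE (carbon usage effectiveness), are established industry measures. The point is that once feeds from different operators are normalized into one record type, any standard can be applied evenly across all of them.

```python
from dataclasses import dataclass

@dataclass
class FacilityRecord:
    """Common schema every operator feed is normalized into (hypothetical)."""
    operator: str
    total_energy_kwh: float  # total facility energy over the reporting period
    it_energy_kwh: float     # energy consumed by the IT equipment alone
    co2_kg: float            # carbon emitted over the same period

def from_operator_a(raw: dict) -> FacilityRecord:
    # Operator A reports energy in MWh and emissions in tonnes, under its own field names.
    return FacilityRecord(
        operator="A",
        total_energy_kwh=raw["site_mwh"] * 1000,
        it_energy_kwh=raw["it_mwh"] * 1000,
        co2_kg=raw["emissions_tonnes"] * 1000,
    )

def from_operator_b(raw: dict) -> FacilityRecord:
    # Operator B already reports kWh but splits energy across three fields.
    return FacilityRecord(
        operator="B",
        total_energy_kwh=raw["it_kwh"] + raw["cooling_kwh"] + raw["other_kwh"],
        it_energy_kwh=raw["it_kwh"],
        co2_kg=raw["co2_kg"],
    )

# Once everything sits in one schema, any standard can be applied evenly.
def pue(r: FacilityRecord) -> float:
    """Power usage effectiveness: total facility energy / IT energy."""
    return r.total_energy_kwh / r.it_energy_kwh

def cue(r: FacilityRecord) -> float:
    """Carbon usage effectiveness: kg CO2 per kWh of IT energy."""
    return r.co2_kg / r.it_energy_kwh

lake = [
    from_operator_a({"site_mwh": 12.0, "it_mwh": 8.0, "emissions_tonnes": 4.2}),
    from_operator_b({"it_kwh": 9000, "cooling_kwh": 3500, "other_kwh": 500, "co2_kg": 5100}),
]
for rec in sorted(lake, key=pue):
    print(f"{rec.operator}: PUE={pue(rec):.2f}, CUE={cue(rec):.3f} kgCO2/kWh")
```

Because every operator’s feed ends up in the same schema, swapping in a different standard is just a matter of writing another metric function over the shared record type.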
TWA: Can data center operators make the data accessible?
Many of them have already made it available for their customers to use. But it’s all in different formats. Imagine a room with 30 different people, the operators, all speaking different languages. They’re all yelling simultaneously, saying the same things, but you can’t understand anything they’re saying.
My goal right now is to take all of those 30 languages, those 30 data feeds, and those 30 formats and get them into something consistent in an auditable fashion so everyone can trust that we’re not doing anything nefarious.
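As a hedged illustration of what "consistent in an auditable fashion" could mean in practice, the sketch below pairs each normalized record with a cryptographic hash of the raw source feed, so a third party holding the original feed can verify the translation was not tampered with. The function and field names are hypothetical and not Qarbon’s actual design.

```python
import hashlib
import json

def audit_entry(operator: str, raw_feed: bytes, normalized: dict) -> dict:
    """Pair a normalized record with a fingerprint of the raw source feed."""
    return {
        "operator": operator,
        "normalized": normalized,
        # Anyone holding the original feed can recompute this digest and
        # confirm the normalized record was derived from that exact feed.
        "source_sha256": hashlib.sha256(raw_feed).hexdigest(),
    }

raw = b'{"site_mwh": 12.0, "it_mwh": 8.0}'
entry = audit_entry("A", raw, {"total_energy_kwh": 12000.0, "it_energy_kwh": 8000.0})
print(json.dumps(entry, indent=2))
```

An auditor who obtains the operator’s original feed can recompute the digest and confirm it matches the published entry, which is one simple way to establish the trust Davidson describes.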
And then you can start to see the full picture, and we can make improvements. For me, the starting point is to gather what’s out there. And you’d be surprised how much is already out there and available.
For the data centers that don’t want to be part of it, there’s no incentive for them to do anything because there are no standards; there’s no natural way for them to show how well they’re doing. Once there’s a model, a method, a metric, or a rubric they can measure against, the incentive is there for them to start to comply.
Now, it’s a race to more efficiency. And really, the customers will dictate the pace once the data becomes available. Once customers begin to consume it in an even fashion across multiple operators, that will drive efficiency all on its own.
TWA: How many data center operators have signed up?
Our product launches soon. We already have four data center operators that we’re working with now. Without naming names, they are four of the top 10 big ones, and then we’ll be adding one to two every quarter going forward, depending on the demands and the needs of our customer base.
The goal is to keep adding and going until we have a significant amount of the market infrastructure on board. And if you think about it, from a data center operator standpoint, how many times has a data center operator had to throw bodies at just dealing with a customer because there’s a billing mismatch, inventory mismatch, or outage?
If I can remove a lot of that, the data center operator can focus on running the best infrastructure possible instead of keeping massive service management teams constantly explaining things back and forth and negotiating with, in many cases, angry customers.
TWA: So what’s holding them back?
The major data center operators probably don’t see the need, and they’re unlikely to look to comply at any point. They’re focused on presenting the data their own way; they’re big enough that they won’t change.
To me, the standards are more beneficial for the smaller operators. An operator with only five data centers might operate in just one region, want to do something, and would love to have a playbook. This will enable them to work within an organized framework, making it less costly.
It benefits some of the tier two and tier three players out there more than the big guys. But still, our goal is to make every data center operator’s data easy to use and consume. The fundamental journey of Qarbon Technologies is to take data center infrastructure out of the real estate mindset and move it into the IT mindset.
One of the real detractions in the market is that we’re still viewing data centers the same way we book a hotel, a hospital, or a bus station. It’s a chunk of real estate housing the infrastructure, and you’re stuck with whatever doors, windows, egresses, and ingresses are available on that site. Yet these data center operations are a foundational piece of everyone’s global IT infrastructure.
We need to consider it a plugin component to the stack that runs the cloud and our financial environment. That’s what we’re looking to do. AI is tightening tolerances in these data centers as we go denser and denser for the AI workloads moving into them. The denser you go, the more you cut into those margins, so you need to keep those tolerances tight to drive ever denser workloads into the data centers you’re now using and consuming. The infrastructure side of AI will require more access to this information.
We are the first to market with this API orchestration platform focused on data center infrastructure. So again, we’re launching soon, bringing to market a way for people to consume infrastructure in a better and more holistic way. The infrastructure is really driving the foundational components of their business, whether they like to admit it or not.