
Secure Your Software Supply Chain, SBOMs With Codenotary Cloud


Guest: Dennis Zimmer (LinkedIn)
Company: Codenotary (LinkedIn, Twitter)
Keywords: Software Supply Chain Security, Open Source, Immudb
Show: Let’s Talk

Summary: Codenotary is a provider of open source, end-to-end cryptographically verifiable tracking and provenance for all artifacts, actions, and dependencies. With increasing focus on software supply chain security and integrity, Codenotary's solutions are playing an even more important role in the modern tech stack. We invited Codenotary CTO Dennis Zimmer to join us and talk about Codenotary Cloud, how it helps companies secure their assets and artifacts with great ease (cloud native is already very complicated; you don't want to add another layer of complexity there), and how he sees the role of immudb evolving over time as all, or most, online transactions become tamper-proof.

“Codenotary Cloud is our platform that allows every company and every customer to protect their software supply chain,” said Zimmer. “Customers can use it for securing any kind of digital artifact that they have in their supply chain, starting from source codes to container images, to the actual build, but also regarding the software bill of materials (SBOMs).”

Highlights of the show: 

  • What is Codenotary Cloud and how would you compare it with what a vulnerability scanner does?
  • Where does Codenotary Cloud fit in the Software Supply Chain?
  • How is Codenotary Cloud related to immudb?

About Dennis Zimmer: Dennis Zimmer has a strong reputation as a technology visionary across the globe. He was founder and CEO of Opvizor Inc., a virtualization monitoring company. He has been working in the IT industry for over 20 years and has written 10 books, hundreds of magazine articles, and video trainings that are read and used by leading IT professionals. Dennis has been awarded the VMware vExpert recognition (only 30 worldwide) for 11 years in a row. He is also a thought leader within the virtualization community.

About Codenotary: Codenotary brings easy-to-use trust and integrity into the software lifecycle by providing end-to-end cryptographically verifiable tracking and provenance for all artifacts, actions, and dependencies. Codenotary can be set up in minutes and integrates fully with modern CI/CD platforms. It is the only immutable and client-verifiable solution available that is capable of processing millions of transactions a second.


Here is the full unedited transcript of the show:

  • Swapnil Bhartiya: Hi, this is your host, Swapnil Bhartiya, and welcome to TFiR Let’s Talk. Today, we have with us, once again, Dennis Zimmer, CTO of Codenotary. Dennis, it’s great to have you on the show.

Dennis Zimmer: Thank you, Swapnil. Great to be here again and looking forward.

  • Swapnil Bhartiya: Today’s focus is Codenotary Cloud. So I want to learn a bit about it. What is it? Tell me more about it.

Dennis Zimmer: So Codenotary Cloud is our platform that allows every company and every customer to protect their software supply chain. Codenotary Cloud is built on an immutable data structure, and everything written to this data structure is a transaction that is automatically recorded and auditable, potentially forever. Customers can use it for securing any kind of digital artifact that they have in their supply chain, starting from source code to container images, to the actual build, but also the software bill of materials.

So everything that is part of your build, part of your container image, or part of a running container will also be recorded in a tamper-proof way by Codenotary Cloud. It's cloud-based because you can use it either as a managed service, so it's publicly available, or you can limit it to just your own organization; we have both offerings in place. You can think of it as a complete replacement for everything you can do from a security or digital signature perspective.

So you have different users or identifiers. It can be a vulnerability scanner, it can be a QA engineer, but it could also be your software pipeline itself. What you gain eventually is a completely traceable workflow from source to production that always maintains not just the current state of your environment, but also the former or historic state. So you can really go back in time, or back and forth, when it comes to what happened, why something has been deployed and where, or what it contained at a given point in time.
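To make the notarize-then-authenticate idea concrete, here is a minimal, purely illustrative sketch in Python. The function names (`notarize`, `authenticate`) and the in-memory `ledger` are assumptions for this example, not Codenotary's actual API; the real platform records fingerprints in an end-to-end cryptographically verifiable store rather than a plain list.

```python
import hashlib

# Illustrative only: an append-only list stands in for Codenotary's
# verifiable ledger of artifact fingerprints and trust statuses.
ledger = []

def notarize(artifact: bytes, identity: str, status: str = "trusted") -> str:
    """Record an artifact's fingerprint, the signing identity, and a trust status."""
    fingerprint = hashlib.sha256(artifact).hexdigest()
    ledger.append({"fingerprint": fingerprint, "identity": identity, "status": status})
    return fingerprint

def authenticate(artifact: bytes) -> str:
    """Look up the most recent trust status recorded for an artifact."""
    fingerprint = hashlib.sha256(artifact).hexdigest()
    for record in reversed(ledger):  # newest record wins
        if record["fingerprint"] == fingerprint:
            return record["status"]
    return "unknown"

build = b"my-app-v1.0.0 binary contents"
notarize(build, identity="ci-pipeline")
print(authenticate(build))               # trusted
print(authenticate(b"tampered binary"))  # unknown
```

Because the lookup is by content fingerprint, any modified artifact simply fails to match and comes back "unknown", which is the core of the traceability described above.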

  • Swapnil Bhartiya: When I look at it, it looks like a one-stop shop for a lot of solutions, which also means, I don’t want to say this, that you’ll be putting a lot of competitors out of business. But if I ask you, how does it compare to other solutions out there? You can be diplomatic, or you can be totally blunt and say, hey, these are our competitors. The whole idea is not about who your competitors are; it’s to make life easier for your users or potential users, so they can go to one place and address the big supply chain challenge that we are seeing today.

Dennis Zimmer: I think the biggest challenge that most companies currently have is really the complexity. For example, how can I digitally trust something? Or how can I really make sure that my software build only contains what I actually allow or want to have in my software components? So I would say a lot of the functionality is really new. In our case, we started with all the technology about three years ago and have now come to this product that can be very easily integrated everywhere.

But I would also say that a couple of years ago, nobody really saw the need to go so deep into the software supply chain, until these recent attacks, which really started about one and a half years ago and are now becoming more and more common. So I think at some point, it’s really about new challenges that companies and users face, and that we solve.

On the other hand, the first step was always about simple security. You make sure the integrity is okay, that what you download is really what you download, or that vulnerability scanning is fine. But only a few companies started to think: okay, now what do we do with this information, and how can we make sense of it when it comes to a bigger, complete overview of everything we have, all the way up to risk management?

Risk management needs to decide: is our eCommerce shop affected, or is it just a test environment that nobody is really using and that holds no real data? Companies realize more and more that a vulnerability scanner alone doesn’t cut it, and a secure pipeline alone doesn’t cut it either. It needs to be a complete solution with a complete overview. Even if you have a secure storage platform, it doesn’t mean you can still trust what you trusted, or stored, two years ago.

So I wouldn’t say that we are ahead of the competition, but of course, we cover some portions that other software products cover as well, like an evidence store or the bill of materials. In our case, though, we don’t just store the bill of materials somewhere in a deep repository or on a file server. We store it in an immutable platform, so you can trust even the bill of materials. That is something that gets more and more important, and I would say it’s key to surviving all the supply chain attacks that are going to follow.

  • Swapnil Bhartiya: How does this compare to other solutions? I know there are scanners out there. Can you explain what edge you have and what the benefits are compared to, let’s say, plain vulnerability scanners?

Dennis Zimmer: A vulnerability scanner typically does a just-in-time analysis: your container image or your binary is scanned while it’s being built, and then it’s delivered. In our case, first of all, we don’t support just one vulnerability scanner. We can use our own, and you can use external vulnerability scanners, so you have the full picture of multiple security components and their results attached to your artifacts. But we also keep this information over time, and because we store the uniqueness, the fingerprints, of all the artifacts, we can even do continuous scanning.

So it’s not just about having a scanner result for one pipeline job. It’s having the information from all your pipeline jobs over time, which can also change the opinion you have of a certain artifact. I’m sticking to the vulnerability scanner case, but you can also replace it with a compliance scanner or any other kind of scanning tool. Just imagine that you have 20 pipelines using the same image, but they’re using it over the course of, let’s say, two or three weeks.

Now, the vulnerability scanner finds a vulnerability today, but you already rolled out your setups, systems, and builds over the last three weeks. In our case, it immediately pops up: hey, you already used this for the last 30 days, so it’s already deployed in 20 or 30 different environments, based on this new information we got. So it’s really about visibility across everything.

Another thing not to forget: say you don’t allow a container image to be used in a certain way, for example to upload or download updates while running. You could untrust these container images because the vulnerability scanner or the compliance scanner comes back reporting that the container image behaves strangely. So we can track not just the situation itself; we can also permanently or temporarily untrust a certain component until it has been rigorously checked and cleared.
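The continuous-scanning idea above can be sketched as follows. This is a hypothetical, simplified model, not Codenotary's implementation: because every deployment records the artifact's fingerprint, a later scanner verdict can be mapped back to every environment that ever used that artifact. The names `record_deploy` and `untrust` are invented for this illustration.

```python
import hashlib

deployments = []  # (fingerprint, environment) records written at deploy time
trust = {}        # fingerprint -> current trust status

def record_deploy(artifact: bytes, environment: str) -> None:
    """Remember where an artifact was deployed, keyed by its fingerprint."""
    fp = hashlib.sha256(artifact).hexdigest()
    deployments.append((fp, environment))
    trust.setdefault(fp, "trusted")

def untrust(artifact: bytes, reason: str) -> list:
    """A new scanner finding untrusts the artifact and returns every
    environment that ever deployed it."""
    fp = hashlib.sha256(artifact).hexdigest()
    trust[fp] = "untrusted: " + reason
    return [env for f, env in deployments if f == fp]

image = b"base-image:1.2 layers"
for env in ("prod-eu", "prod-us", "staging"):
    record_deploy(image, env)

affected = untrust(image, "CVE found by continuous scan")
print(affected)  # ['prod-eu', 'prod-us', 'staging']
```

The key design point mirrors what Zimmer describes: the scan result is attached to the fingerprint, not to a single pipeline run, so one new verdict immediately surfaces every past deployment.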

  • Swapnil Bhartiya: How long does it take now to identify affected software, versus how long it traditionally used to take?

Dennis Zimmer: It depends a bit on where something is affected. It can actually take 280 days; that is the average, even. At some point, maybe you’re lucky and you just deployed what is now being affected, so of course it can be faster. But what we wanted to achieve with Codenotary is that it’s always the same, extremely fast time. So you can instantly find out not just if something is now vulnerable, but also if something just became unwanted: maybe there’s a license change, or maybe you no longer have a contract in place with a certain third-party vendor.

There are so many different situations where you just want to know where something is currently used and stop it from being used in the future. We provide all the tooling for customers to implement exactly that. To give you a software build example: a pipeline simply stops building anything based on an unwanted component. It either automatically uses a wanted component or it blocks, so somebody needs to decide what to do next, but you can be assured it’s not going to be released into production.
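A pipeline gate of the kind described here could, in principle, look like this sketch. The `gate` function, the status strings, and the component names (including the Log4j example) are assumptions for illustration only; a real pipeline would query the ledger for each dependency's current trust status before building.

```python
# Illustrative pipeline gate: check every dependency's trust status
# and block the build if anything is unwanted.
def gate(components: dict) -> tuple:
    """components maps component name -> trust status.
    Returns (allowed, list of blocked component names)."""
    blocked = [name for name, status in components.items() if status != "trusted"]
    return (len(blocked) == 0, blocked)

# A build pulling in an untrusted dependency is stopped...
ok, blocked = gate({
    "log4j-2.14": "untrusted: CVE-2021-44228",
    "openssl-3.0": "trusted",
})
print(ok, blocked)  # False ['log4j-2.14']

# ...while a build with only trusted components proceeds.
ok2, blocked2 = gate({"openssl-3.0": "trusted"})
print(ok2, blocked2)  # True []
```

This is the "block, so somebody needs to decide what to do next" behavior: the gate fails closed, so nothing built on an unwanted component reaches production.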

What you mentioned also came up regarding time. Time is probably the most important part. It doesn’t help to know that something has been vulnerable for two weeks if you don’t know where it’s running. You need to know where something is running; if you don’t know, maybe it runs there forever. Something I also noticed when discussing with different customers is that a vulnerability scanner is often not really looked at: you have it in place, checkbox ticked, but what do you do if there’s a vulnerability in your image?

Some processes are not even in place, and using Codenotary, you can solve two issues. One is that you at least have a baseline: you have an idea of where something has actually been deployed and what the situation was when it was deployed. So even if you come up with a new compliance rule in hindsight, you can immediately overlay it and then fix it or act on it, and you don’t need to do everything in one step. You can really improve over time.

  • Swapnil Bhartiya: Excellent. Excellent. Yeah, it does matter. Now, a second aspect, in addition to timeliness, is cost. Sometimes cost can also be a [inaudible 00:11:07]. Can you talk about how Codenotary Cloud affects the cost associated with identifying a vulnerability? Sometimes time itself is a cost, but sometimes there is a real cost as well.

Dennis Zimmer: I think the biggest real cost is if you don’t know that you are affected. That is something that Log4j made pretty clear to a lot of companies. Some companies even needed to delay releases of software that is their main revenue stream, for example. Or, as a software vendor, we received requests from our customers: “Hey, are you affected? Do you use Log4j? Do we need to update our solutions?”

When you spoke with these companies and customers, you found out they reached out to hundreds or even thousands of vendors, and that shows a little bit of the desperation: something happens, and now we need to find it because we don’t know what kind of damage could potentially hit us. What we do know is that Log4j delayed releases, and very likely also delayed revenue streams worth hundreds of millions. When it comes to damage for a certain customer, just imagine your main web shop goes down just before Christmas; then the damage could be 50% of your revenue for the year.

  • Swapnil Bhartiya: If you’re not a monopoly, it also damages your brand and reputation, so it’s hard to retain customers as well. We talked about these two aspects, we talked [inaudible 00:12:49]. I also want to talk about the broader, wider picture of the supply chain, because there was recently a meeting at the White House with a lot of discussion around it. You folks have been doing this for a while, so what is the timeliness of this announcement now? How does it fit into the whole discussion around supply chain security?

Dennis Zimmer: It’s extremely important. I think the attacks in all the news over the last couple of months were already an eye-opener. I wouldn’t say it’s something nobody was aware of before, but now it gets a completely different level of seriousness. People are no longer thinking of it as maybe nice to have; it’s now about: we need to protect our supply chain, but we also need to make sure that we know what our suppliers do, and we need to get information from them about what components they are using.

This discussion, I think, needs to happen at a governmental level. Otherwise, you’d never get really big companies and big software vendors to provide a bill of materials for every single patch, as an example. Most companies don’t even develop their own software; they just use software from others, and that’s completely fine. But in that case, you also need to know what is in these software components and what is in the software that you are currently running.

Because eventually, from a risk perspective, you need to know: how can I mitigate this risk, how much will it cost me if I don’t fix a certain issue, and, probably the most important question, am I affected? Answering “am I affected?” needs a bill of materials for all software and all patches. I think everything that is happening now from the government’s side also drives vendors to provide this bill of materials information with their software. That is definitely one of the most important things for securing software in the future.

  • Swapnil Bhartiya: This might seem like a different topic, but it’s not: you folks are also the creators of immudb. We have covered immudb before and had a lot of discussions around it, but I also want to understand what role it will play when we look at not just the software supply chain, but making whole environments and workloads secure as well.

Dennis Zimmer: Well, when we started developing immudb, our goal was to create the foundation for Codenotary Cloud. But immudb itself is a database that stores everything dependably and tamper-proof. The tamper-proofness is not guaranteed by the database alone; it’s a combination of the database and all the clients connected to and communicating with it. That also means immudb scales extremely well, and that is one of the most important things: if you have an immutable platform that doesn’t scale well, most of the use cases drop out, because you cannot use it for fast transactions.

We use it for fast transactions as well; because of pipelines, we have customers with several tens of thousands of builds a day. If your tamper-proof platform is slow, you slow down software development, and that has a huge impact. But we didn’t want immudb just for our own use case, so we made it an open source project, because there are so many different use cases, from fintech to compliance to a lot of regulatory cases, or simply storing invoices in an immutable fashion.

There are so many use cases, and we see the community adopting and receiving immudb extremely well. I think we just crossed 7,000 stars a couple of weeks ago, and the feedback we get from the community shows there are so many different use cases people are starting to look into. There are even companies now looking into running web applications with immudb in the backend, to have a tamper-proof application as well.

So there are a lot of very interesting use cases popping up left and right from the community, and we are, of course, very thankful that the community is sharing all this information and these ideas with us. But immudb itself is probably the only database, and definitely the only open source database, that has been built from scratch with immutability in mind. That is the main and most important feature of immudb. It’s not an attachment or a plugin; it’s really the core.
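The client-side verification Zimmer describes, where tamper-proofness comes from the database plus its clients, can be illustrated with a toy hash chain. This is a deliberate simplification: immudb's actual proofs are Merkle-tree based, not a linear chain, and the code below only demonstrates the principle that the client, not the server, detects tampering.

```python
import hashlib

def entry_hash(prev_hash: str, value: str) -> str:
    """Each entry's hash commits to the previous hash, hence to all history."""
    return hashlib.sha256((prev_hash + value).encode()).hexdigest()

log = []  # append-only list of (value, hash) pairs

def append(value: str) -> None:
    prev = log[-1][1] if log else ""
    log.append((value, entry_hash(prev, value)))

def verify() -> bool:
    """A client replays the chain; any edited entry breaks the hashes."""
    prev = ""
    for value, h in log:
        if entry_hash(prev, value) != h:
            return False
        prev = h
    return True

append("invoice-001")
append("invoice-002")
assert verify()  # untouched chain verifies

log[0] = ("invoice-001-FORGED", log[0][1])  # simulate server-side tampering
print(verify())  # False
```

Because every stored hash depends on the entire history before it, a client that replays the chain will catch a forged entry even if the server presents it as legitimate, which is the property that makes the verification "client-side".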

  • Swapnil Bhartiya: Do you also see that at some point, no matter what kind of transaction is happening, and I’m not just talking about financial transactions, the database should be tamper-proof, so that nobody can tamper with it and change what was intended? Do you think that in the future this should just become a default, or is it something specific to certain use cases?

Dennis Zimmer: Of course, it would be great if it became a default, but immutability also comes with a downside, and that is the amount of data. There is also information you want to lose after a certain period of time; you don’t want to keep it forever. So I would agree it should be the default for data that, for a certain time, needs to be tamper-proof, or actually trustable, so that somebody can really trust and verify this data. At some point, when it’s not so important anymore, or it’s more for statistical purposes, you can move it back into the normal mutable database space.

So I think there are a lot of use cases, for sure. Making it a default is probably hard in the long term, just because of the sheer amount of data you need to save. But I think immutable data structures are definitely the most important part of protecting any kind of application, and the application owner, or whoever you are, needs to decide, based on the data classification, how long this immutability and tamper-proofness needs to be maintained. It applies to many, many different use cases, and we see it in the whole blockchain ecosystem: just imagine you have an NFT and suddenly the NFT is not verifiable anymore; then you have basically lost all the money you spent on it.

So the whole world is changing such that, for a certain period of time, you need immutability and you need verifiable data. We want to provide exactly that as an open source project, so that internal, on-premises, and also cloud-based applications that don’t want to rely on a blockchain or some external service can immediately just use us. You mentioned embeddable, and immudb is also embeddable. Currently, it’s already running on Raspberry Pis and some other IoT devices, even if just to store the data tamper-proof for a shorter period of time before it’s shipped to a central station.

Another very interesting use case is all kinds of IoT devices that have a certain state and, from time to time, get connected to internet access to sync their data to a central platform. From the immudb perspective, we can make sure that, at least while the device is out in the open, nobody can tamper with the data.

  • Swapnil Bhartiya: Dennis, thank you so much for taking the time today to, of course, talk about Codenotary Cloud, but also immudb, and more importantly, to share your insights on both supply chain security and immutable data. As usual, I would love to have you back on the show, but thanks for your time today. Thank you.

Dennis Zimmer: I would love to. Thank you very much.
