
Is DIY-AI Right For You?


Imagine you’re planning a vacation to the beach. You could stay at an all-inclusive resort with a bevy of amenities (a full meal plan, spa, recreational activities, and more) designed to offer everything you might need or want during your vacation. Or you could rent a vacation home. You wouldn’t get the same perks, but you’d probably be more inclined to explore and make the trip your own.

Both are great options. But one might be better for you than the other, depending on the type of vacation you want.

You can apply the same thought process to your approach to artificial intelligence (AI). True, we’re not talking about deciding between lounging near a massive pool or relaxing on a quiet oceanfront deck. But your AI options can often be boiled down to two choices:

  • An all-inclusive package featuring a handful of powerful data science software applications;
  • An “AI-as-a-Service” model that gives your team the freedom and flexibility to choose the tools it needs, when it needs them.

All-inclusive, but at what cost?

The first option is generally offered by larger vendors that strive to provide a complete set of resources for all of your AI needs. It can be great for big companies with extensive budgets and little time or inclination to experiment with each new technology that’s rolled out. Organizations can benefit from the convenience of having a wide array of AI options at their fingertips.

There are a couple of potential downsides to this option, however.

First, there’s the possibility of paying for more than you need while getting locked into those vendors’ products and ecosystems. You might not have the flexibility to choose other solutions, from other vendors, that better suit the unique needs of your MLOps teams, and you might never use some of the tools you’re paying for.

Second, you’re reliant on those vendors to keep their products up to date with new features and tools that will enable your team to keep its innovative edge. What if they fail to live up to their end of the bargain?

Choose your own adventure

If you’re uncomfortable with the possible answer to that question, you may want to look at option two, which involves more of a “DIY” approach to AI (DIY-AI, maybe?).

With an AI-as-a-Service model, you’re not locked into a particular vendor or predetermined set of offerings. Instead, your team members can choose specific tools for specific jobs. Those tools could be leading-edge open source technologies, major software packages from large commercial vendors, smaller niche products designed for specific needs and tasks, or some combination of these.

For example, let’s say you’re forming a model operations team composed of developers, operations managers, and data scientists. Each group wants to be able to use their preferred technologies. Data scientists might want to use PyTorch or TensorFlow instead of another framework, depending on their needs, while developers might prefer Git, Argo CD, or a similar solution for a particular project. Or a data engineer might discover interesting data lineage capabilities in a tool like Pachyderm and want to explore it further.
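To make that concrete, here is a minimal, hypothetical sketch of the kind of framework-specific code a data scientist who prefers PyTorch might write. It assumes only that the open source torch package is installed, and it is not tied to any particular vendor platform; the same task could just as easily be expressed in TensorFlow or another framework the team prefers.

```python
# Hypothetical sketch: a data scientist choosing PyTorch for a tiny
# regression task. The point is the freedom to pick the framework,
# not this specific model.
import torch
import torch.nn as nn

# Toy data: learn y = 2x from a handful of points.
x = torch.linspace(0, 1, 32).unsqueeze(1)   # shape (32, 1)
y = 2 * x

model = nn.Linear(1, 1)                      # one-feature linear model
optimizer = torch.optim.SGD(model.parameters(), lr=0.5)
loss_fn = nn.MSELoss()

for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"learned weight: {model.weight.item():.2f}")  # approaches 2.0
```

Meanwhile, an operations-focused teammate could version and deploy the same project with Git and Argo CD, without either choice dictating the other.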

This may not be possible in an all-inclusive environment, which could limit your team’s ability to choose its own preferred technologies. But it is possible when AI is treated as another service offering. Presenting your team with AI solutions in an a la carte manner enables them to use the tools they want without being overburdened by solutions they do not need.

Using a common open source infrastructure makes it all possible. You can plug in different tools, from different vendors, for different needs. You’ll want to validate that these tools are certified to run on the underlying platform. Once you do, your developers, operations managers, and data scientists can select their own technologies and use the common platform as a basis for open collaboration on AI projects.

Using a guide

AI-as-a-Service also gives your teams complete control over the tools they use, especially if those tools originate from the open source community. They won’t have to wait for a commercial vendor to issue an update or fix, and they can adopt exciting new technologies as appropriate. For those who want to depend on a vendor for commercial support, there are many open source tools with commercial options available from software vendors or service providers.

This approach requires more effort from model operations teams and, as such, might not be ideal for every organization. Teams must be willing to identify and select specific toolsets and keep up with changes happening in both the open source and commercial communities. That requires a certain level of expertise and willingness to experiment, which must be supported by the company.

However, working with a partner who can offer assistance (a form of “getting DIY help from your friends”) can lessen the burden. Such a partner can serve as a third-party connection point between the open source and commercial software communities and provide guidance on, and access to, AI technologies.

It’s your choice

Again, both the all-inclusive and as-a-service options are viable paths to take, depending on your organization and your goals. The former is perfect for teams that want their journey laid out for them or do not have much need for wiggle room. The latter is ideal for those who seek a little more adventure and freedom.

Which path will you take?


By Will McGrath, Product Marketing Manager, Cloud Storage and Data Services, Red Hat