CUSTOMER BACKGROUND

Bedrock Analytics, a US company, is one of the key players in the global retail analytics market, in particular CPG (Consumer Packaged Goods) data analytics. The company’s customers are manufacturers of beverages, food products, detergents, and cosmetics that compete for a share of supermarket sales.

Bedrock Analytics uses AI algorithms to gather, process, and analyze data from a variety of sources: point-of-sale data, Syndicated Data Aggregators, Retail Store portals, e-commerce databases, consumer panel data sets, and more. The input is gigantic arrays of Big Data; the output is a content-rich analytical extract that would take teams much longer to produce any other way. It is more storytelling than statistics: a set of tools that helps product manufacturers find the best way to reach their customers.

Thanks to the digital products and services of Bedrock Analytics, unique craft products can succeed in retail and the largest brands can maintain their market share. Well-designed, Big Data-based solutions can give a boost to products that would otherwise struggle in a highly competitive marketplace.

The company is ranked among the top twenty global players in CPG analytics, holding an expert position alongside IBM, Microsoft, Manthan Software Services, and Oracle. So when an inquiry came in August 2020 from Oakland, CA, from Navdip Bhachech, Senior Vice President of Engineering at Bedrock Analytics, we were excited to work with his team. After all, we had heard a lot about their products! From the very beginning the project looked incredible, and it turned into an inspiring DevOps experience that is worth sharing.

THE TASK

The CPG data analytics market has been growing continuously and is estimated to increase by 20% during 2021-2028. The systems of Bedrock Analytics will therefore have to process ever larger volumes of data, and their IT infrastructure must be able to handle that scope of operations, as the information Bedrock Analytics provides is crucial to their customers’ strategic business decisions.

The initial scope of work included two main points: containerization and migration to Kubernetes in order to reduce Amazon bills. However, the requirements were complicated by the need for a multi-cloud solution: the technology stack of Bedrock Analytics had to be deployed on two public clouds, AWS and Microsoft Azure, at the same time and from the same code base. Their goal of expanding beyond the US and working with larger partners required support for Microsoft’s Azure cloud infrastructure.

Among the factors that simplified implementation was clarity on the customer’s side: Navdip Bhachech understood the scope of work and the deeper requirements of Bedrock Analytics, so communication was smooth. The challenge lay in the uniqueness of the solution.

What Bedrock Analytics wanted was the ability to deploy the same application across multiple clouds, which was quite uncommon. “Doing things multi-cloud means that you have to consider a lot of subtle differences between how AWS and Azure do things,” commented Navdip Bhachech. “If you want it to work the same way you have to tune it differently. And I think that was probably the most challenging part, keeping all of those details straight and organized in the work that was done.”
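
To give a concrete, if hypothetical, example of such differences: the same Helm chart can be deployed to both clouds while small per-cloud values files absorb the provider-specific details, such as storage classes and load balancer annotations. The names below are illustrative, not the customer’s actual configuration.

```yaml
# values-aws.yaml (illustrative): provider-specific settings for EKS
storageClassName: gp3          # a commonly used EBS-backed storage class on EKS
service:
  annotations:
    service.beta.kubernetes.io/aws-load-balancer-type: nlb
---
# values-azure.yaml (illustrative): the same keys, tuned for AKS
storageClassName: managed-csi  # the built-in Azure Disk CSI storage class
service:
  annotations:
    service.beta.kubernetes.io/azure-load-balancer-internal: "true"
```

Everything else in the chart stays identical, so a single code base can target either cloud.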

The SHALB team has strong Kubernetes expertise and experience implementing unconventional solutions. We were chosen for this job because of our technical skills and our ability to understand the space of the project. Commenting on the choice of contractor, Navdip Bhachech recalls:

“Upfront was the ability to deliver and we felt good after talking to the SHALB team. That was good technically, we knew they could do it.”

Develop a multi-cloud platform with Kubernetes and Terraform, with support for GitOps and Argo CD? Let us handle the task! SHALB engineers love such challenges!

PROJECT IMPLEMENTATION

Infrastructure audit

As of 2019, Bedrock Analytics’ data orchestration system was not well suited to processing a large volume of operations. Because it was built on AWS OpsWorks, it was vendor-bound and could not be used in other clouds, which also caused inefficiencies and extra expenses. The product’s complexity and the outcomes of the infrastructure audit showed the need for a new orchestration system based on Kubernetes and cloud-native technologies such as Argo CD and Prometheus.
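
As a rough sketch of what GitOps delivery with Argo CD looks like in such a setup (all names, paths, and the repository URL below are hypothetical):

```yaml
# Hypothetical Argo CD Application: syncs the manifests for one workload
# from a Git repository into the cluster (GitOps-style delivery).
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: analytics-api            # placeholder workload name
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://bitbucket.org/example/infrastructure-monorepo.git  # placeholder
    targetRevision: main
    path: charts/analytics-api   # Helm chart kept in the infrastructure monorepo
    helm:
      valueFiles:
        - values-aws.yaml        # per-cloud overrides live next to the chart
  destination:
    server: https://kubernetes.default.svc
    namespace: analytics
  syncPolicy:
    automated:
      prune: true                # remove resources that were deleted from Git
      selfHeal: true             # revert manual drift back to the Git state
```

With automated sync enabled, whatever is merged into the infrastructure repository becomes the desired state of the cluster, and Argo CD reconciles any drift.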

Working with repositories

Having decided on the optimal workflow, SHALB agreed with the development team on how to work with repositories: GitFlow and feature branching. We then created CI/CD pipelines and chose a monorepo approach for infrastructure development.

Workload estimation

Bedrock Analytics is a complex platform that handles hundreds of tasks concurrently, which means roughly as many batch workloads for both analytics and orchestration. We decided to run the open-source engine Airflow on top of Kubernetes. Serhii Matiushenko, lead DevOps engineer at SHALB, was assigned to implement the automation, prepare the infrastructure code, and integrate it with the existing system. Along with other tasks, this took nearly three months.
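
A minimal sketch of how Airflow can be run on top of Kubernetes with the official Apache Airflow Helm chart is shown below; the repository URL and resource figures are placeholders, not the actual project values.

```yaml
# Illustrative values for the official Apache Airflow Helm chart
# (https://airflow.apache.org). Repo URL and sizing are placeholders.
executor: KubernetesExecutor        # each task runs as its own Kubernetes pod

dags:
  gitSync:
    enabled: true                   # pull DAG definitions straight from Git
    repo: https://bitbucket.org/example/airflow-dags.git  # placeholder
    branch: main
    subPath: dags

workers:
  resources:
    requests:
      cpu: "1"
      memory: 2Gi
    limits:
      memory: 4Gi

webserver:
  replicas: 2                       # keep the UI available during node maintenance
```

With the KubernetesExecutor, every batch task is launched as its own short-lived pod, so capacity follows the number of concurrent jobs rather than a fixed worker pool.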

Development of Kubernetes manifests and Helm charts

Each workload needs properly described Kubernetes primitives to run and scale reliably. Describing those primitives with Helm charts and Kustomize turned out to be quite a time-consuming process.
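
For example, a Kustomize overlay can reuse a shared base of primitives and patch only what differs per environment or per cloud; the sketch below is hypothetical.

```yaml
# Hypothetical Kustomize overlay: reuses a shared base of Kubernetes
# primitives (Deployment, Service, HPA) and patches only what differs.
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization

namespace: analytics

resources:
  - ../../base                     # Deployment, Service, HorizontalPodAutoscaler

patches:
  - target:
      kind: Deployment
      name: analytics-worker       # placeholder workload name
    patch: |-
      - op: replace
        path: /spec/replicas
        value: 4
```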

Data orchestration

Since Bedrock Analytics’ workloads are largely batch processing, they needed an orchestrator to run data science jobs. They chose Airflow, given the development team’s experience with it and its suitability for Python-based machine learning environments.

Describing cloud resources with Terraform code

Since there were no suitable public Terraform modules for Microsoft Azure, we had to develop them from scratch. Unlike Terraform modules for AWS, which are plentiful and constantly improved, modules for Azure are fewer and of poorer quality.

CI/CD development

The team had been using Jenkins, which required considerable maintenance effort from the developers. We chose Bitbucket Pipelines as an alternative to Jenkins, saving the developers time: with Bitbucket, CI/CD pipelines run in the same place where the code resides.
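
A simplified sketch of what such a bitbucket-pipelines.yml can look like (the registry, image names, and helper script are placeholders):

```yaml
# Hypothetical bitbucket-pipelines.yml: builds a container image and updates
# the manifests so that Argo CD rolls out the change (GitOps), with no
# separate CI server to maintain.
image: atlassian/default-image:4

pipelines:
  branches:
    main:
      - step:
          name: Build and push image
          services:
            - docker
          script:
            - docker build -t example.registry.io/analytics-api:$BITBUCKET_COMMIT .
            - docker push example.registry.io/analytics-api:$BITBUCKET_COMMIT
      - step:
          name: Update manifests for Argo CD
          script:
            # placeholder helper: bumps the image tag in the infrastructure monorepo
            - ./scripts/update-image-tag.sh analytics-api $BITBUCKET_COMMIT

definitions:
  services:
    docker:
      memory: 2048
```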

Documentation

At SHALB, we thoroughly document all stages of infrastructure development to make its further maintenance easier. The case of Bedrock Analytics was no exception.

Delivering the solution

After implementing the project tasks, we delivered the prepared solution to the customer. The delivery went through several stages:

  • POC
  • DevInfras
  • Staging
  • Production
  • Synchronized migration to AWS and Azure

PROJECT OUTCOME

As a result of the project, Bedrock Analytics received a more modern infrastructure, driven and managed by configuration. As Navdip Bhachech points out, they can now set up and tear down new environments much more easily and through source code. The flexibility of the new platform also provides configurable scaling and tracking of infrastructure changes, which enhances the overall observability of the system. Their ability to manage cloud costs has also improved thanks to automation and better configuration of services.

The infrastructure solution designed by SHALB covered the customer’s business processes and automated most of them. As a result, we optimized their infrastructure maintenance costs and accelerated their product’s time to market thanks to faster, simultaneous deployment to different clouds.

CUSTOMER FEEDBACK

Commenting on the project, Navdip Bhachech said:

Navdip Bhachech, Senior Vice President of Engineering, Bedrock Analytics

“What I liked about Volodymyr and Sergii is that they would figure things out. They were very proactive in finding the resolution to every little issue and figuring it out and doing the research. That was nice to see. We were leaning on them for their expertise as the process went on. I think they did a great job on that and figured out a lot of little details.”

The case of Bedrock Analytics added another significant and challenging project to the SHALB portfolio. Our team is friendly, proactive, and made up of true masters of their craft. More importantly, we understand the space of your project, no matter how complex it might be. Modernize your stack and stand out from the crowd with competitive advantages! Book an online meeting or contact sales@shalb.com for more information.