Azure Cosmos DB is a globally distributed, multi-model NoSQL database service that allows us to build highly available and scalable applications. Cosmos DB supports applications that use document data through its SQL API and MongoDB API.
I’ve been meaning to produce more content on Cosmos DB’s MongoDB API, so in this article, I’m going to develop a serverless API in Azure Functions that uses a Cosmos DB MongoDB API account. This article is loosely based on this fantastic tutorial on creating a Web API with ASP.NET Core and MongoDB.
By the end of this article, you’ll know…
The ability to perform backups on our data is essential to ensure that we can recover in the event of any data failure, such as data corruption, human error or datacenter failure.
Azure Cosmos DB takes backups of our data automatically at regular intervals without affecting the performance or availability of our database operations. This is great for when we encounter any of the data failures mentioned above.
There are two options available to us when we need to perform backups in Azure Cosmos DB:
Back in January 2020, I wrote an article on Azure Cosmos DB’s ‘Autopilot’ mode, released in November 2019, which was still in preview at the time of writing.
Not only has this feature gone GA (Generally Available), but it also has a much better name (in my opinion): Autoscale!
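To make the pricing model concrete: with Autoscale, you configure a maximum RU/s for a database or container, and Cosmos DB scales instantly between 10% of that maximum and the maximum itself, billing per hour for the highest throughput the system scaled to. A minimal sketch of that range calculation (the function name is mine, purely illustrative):

```python
def autoscale_range(max_rus: int) -> tuple[int, int]:
    # Autoscale scales instantly between 10% of the configured
    # maximum RU/s and the maximum itself.
    return max_rus // 10, max_rus

# A container with a 4,000 RU/s autoscale maximum idles at 400 RU/s
# and can burst up to 4,000 RU/s when traffic demands it.
low, high = autoscale_range(4000)
```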
In this article, I will cover the following topics:
I’m currently working on a personal project that reads various CSV files from a local directory, uploads the files to Azure Blob Storage and then persists the records of each file into Azure Cosmos DB. These are files downloaded from my Fitbit dashboard.
At a high level, the architecture looks like this:
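As a sketch of the first stage of that pipeline, parsing a local CSV file into records before they are uploaded and persisted, something like the following works (the Blob Storage and Cosmos DB steps would use their respective SDKs; the column names here are a hypothetical Fitbit-style export, not the real schema):

```python
import csv
import io

def read_records(csv_text: str) -> list[dict]:
    # Parse CSV content into one dict per row, ready to be
    # persisted as documents in Cosmos DB.
    return list(csv.DictReader(io.StringIO(csv_text)))

# Hypothetical Fitbit-style export; the real column names will differ.
sample = "Date,Steps\n2020-08-01,9502\n2020-08-02,12044\n"
records = read_records(sample)
```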
Azure provides us with its own implementation of Redis called Azure Cache for Redis. This is an in-memory data store that helps us improve the performance and scalability of our applications. We’re able to process a large number of application requests by keeping the most frequently accessed data in server memory so that it can be written to and read from quickly.
Azure Cache for Redis is a managed service that provides secure Redis server instances with full Redis API compatibility. We can use Azure Cache for Redis in the following scenarios:
Caching our data
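The data-caching scenario is usually implemented with the cache-aside pattern: check the cache first, fall back to the data store on a miss, then populate the cache for next time. A minimal sketch of the pattern, using a plain dict as a stand-in for a real Redis client (against Azure Cache for Redis you’d use an actual client library such as redis-py, and a placeholder lookup function in place of your real data store):

```python
# A plain dict stands in for a Redis client here; with Azure Cache
# for Redis you'd use a real client (e.g. redis-py) instead.
cache: dict[str, str] = {}

def slow_database_lookup(key: str) -> str:
    # Placeholder for a query against the backing data store.
    return f"value-for-{key}"

def get_with_cache(key: str) -> str:
    if key in cache:                        # cache hit: served from memory
        return cache[key]
    value = slow_database_lookup(key)       # cache miss: query the database
    cache[key] = value                      # populate the cache for next time
    return value
```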
In the process of refactoring some of our microservices at work, I came across a service that didn’t have any unit tests! This service uses the Azure Cosmos DB Change Feed to listen to one of our write-optimized containers related to customers. If a new customer is created in that container, we pick up that Customer document and insert it into a read-optimized container (acting as an aggregate store) which has a read-friendly partition key value.
This read-optimized container is then utilized by other services within our pipeline when we need to query the aggregate for…
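The heart of that service, the part worth unit testing, is the reshaping step: take a Customer document from the write-optimized container and produce the aggregate with a read-friendly partition key. A sketch of the idea as a pure function (the field names are illustrative, not our actual schema):

```python
def to_read_model(customer: dict) -> dict:
    # Reshape a Customer document from the write-optimized container
    # into the aggregate stored in the read-optimized container.
    # Field names here are illustrative, not the real schema.
    return {
        "id": customer["id"],
        # A read-friendly partition key (e.g. region) keeps queries
        # for the aggregate within a single logical partition.
        "partitionKey": customer["region"],
        "name": customer["name"],
    }
```

Keeping this transform free of any Cosmos DB SDK types is what makes it trivially unit-testable.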
In the past, performing traditional analytical workloads with Azure Cosmos DB has been a challenge. Workarounds do exist, such as manipulating the amount of provisioned throughput, or using the Change Feed or another ETL mechanism to migrate data from Cosmos DB to platforms better suited to analytics, but they are a challenge to develop and maintain.
Azure Synapse Link for Cosmos DB addresses the need to perform analytics over our transactional data without impacting our transactional workloads. This is made possible through the Azure Cosmos DB analytical store, which allows us to sync our transactional data into…
At Build 2020, the Azure Cosmos DB team announced that they were working on a ‘Serverless’ preview for Cosmos DB, allowing developers to provision Cosmos DB accounts that only use throughput when operations are performed on that Cosmos DB account, instead of having to provision throughput at a constant rate.
Out of absolutely nowhere (it’s been a hell of a week!), the Cosmos DB team announced today that the Serverless preview has been released for the Core API (or SQL)! You can head to Azure right now and provision Cosmos DB accounts without having to provision any throughput.
If we want to store information such as documents, images or videos, Azure Blob Storage is a great storage solution that provides us as developers with high availability at a low price.
For the AZ-204 exam, we’ll need to know how to do the following:
But first we’ll need to set up a couple of storage accounts in Azure to follow…
In four weeks’ time, I plan to take my AZ-204 exam. I’ve been working with Azure in my day-to-day job for almost two years now, but looking at the exam skills outline, there are a few technologies on there that I’ve never touched before.
So in order to help my preparation for the exam, I plan to write a series of blog posts on each skill measured. Hopefully this will not just reinforce my learning, but also help others who may be taking AZ-204 themselves, or just want to learn something new.
In this article, I’m going to…