App Modernization with NoSQL Hackathon

Instructor-led Training

Course Description

This Hack enables attendees to learn how to implement mission-critical data solutions for modern applications.

This Hack simulates a real-world scenario where developers need to migrate a monolithic legacy application to a cloud-based NoSQL environment, taking advantage of microservices and event sourcing.

During the “hacking,” attendees will focus on:

  1. Migrating an existing on-premises database to the cloud (using NoSQL as the data solution) and then collecting and storing real-time data in the new NoSQL data platform.
  2. Optimizing the NoSQL database through modeling, implementing a good partition strategy, indexing, denormalization, creating materialized views, and tuning throughput for varying workloads. Next, attendees will implement an event sourcing pattern to optimize, scale, and distribute the processing of streaming data in real time.
  3. Expanding search capabilities by enabling full-text, fuzzy, and faceted searches against the data store.
  4. Configuring replicas across multiple regions worldwide.

By the end of the Hack, attendees will have built out a technical solution that incorporates several complementary Azure services for stream processing, data storage, and visualization. These services work together to create an end-to-end, modern, flexible, and scalable cloud-native NoSQL data solution to store, process, and access any required business data.

About this Course

Challenge 1: To the cloud  
In this challenge, you will provision the NoSQL database. 
Learning objectives: 
  • Provision a NoSQL database in Azure with the following characteristics:
    o Enables a simplified process for scaling up and down
    o Supports the event sourcing pattern, where changes to the data store trigger events that can be processed by any number of listening components in near real-time
    o Supports a flexible schema with multi-region, global distribution
  • Store any arbitrary record in the database (one possible approach is sketched below)
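A minimal sketch of one way to meet these objectives with the Cosmos DB .NET SDK (Cosmos DB appears in this Hack's technology list); the endpoint, key, database, container, and partition key are hypothetical placeholders, not values supplied by the Hack:

```csharp
using System;
using Microsoft.Azure.Cosmos;

// Hypothetical endpoint and key; a Cosmos DB account is assumed to exist.
CosmosClient client = new CosmosClient(
    "https://<account>.documents.azure.com:443/", "<key>");

// Autoscale throughput supports the "simplified scaling up and down" objective.
Database database = await client.CreateDatabaseIfNotExistsAsync("RetailDb");
Container container = await database.CreateContainerIfNotExistsAsync(
    new ContainerProperties(id: "Orders", partitionKeyPath: "/customerId"),
    ThroughputProperties.CreateAutoscaleThroughput(autoscaleMaxThroughput: 4000));

// Store an arbitrary record: no fixed schema is enforced beyond an id
// and the partition key path.
var order = new { id = Guid.NewGuid().ToString(), customerId = "c-42", total = 19.99 };
await container.CreateItemAsync(order, new PartitionKey(order.customerId));
```

Autoscale throughput addresses the scale-up/scale-down objective, and the same container's change feed underpins the event sourcing work in Challenge 4.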

Challenge 2: Migrating the database to the cloud  
In this challenge, you will migrate all data to the new database. 
Learning objectives: 
  • Create a repeatable process to migrate from the supplied SQL database to the selected NoSQL database, validate the migration through queries, and explain to the coach how the database can be scaled (one possible copy step is sketched below)
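As a hedged illustration of such a repeatable process in C# (Azure Data Factory, listed among this Hack's services, is an equally valid tool), the sketch below copies rows from a hypothetical Products table into a Cosmos DB container using bulk execution; the connection strings, table, and column names are assumptions:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Microsoft.Data.SqlClient;

// Bulk mode coalesces many point writes into batched requests.
CosmosClient cosmos = new CosmosClient("<endpoint>", "<key>",
    new CosmosClientOptions { AllowBulkExecution = true });
Container target = cosmos.GetContainer("RetailDb", "Products");

using var sql = new SqlConnection("<sql-connection-string>");
await sql.OpenAsync();
using var cmd = new SqlCommand(
    "SELECT ProductId, Name, Category, Price FROM Products", sql);
using var reader = await cmd.ExecuteReaderAsync();

var writes = new List<Task>();
while (await reader.ReadAsync())
{
    var doc = new
    {
        id = reader.GetInt32(0).ToString(), // Cosmos DB requires a string id
        name = reader.GetString(1),
        category = reader.GetString(2),     // a candidate partition key
        price = reader.GetDecimal(3)
    };
    writes.Add(target.CreateItemAsync(doc, new PartitionKey(doc.category)));
}
await Task.WhenAll(writes);
```

Validation can then be as simple as comparing the SQL row count with a `SELECT VALUE COUNT(1)` query against the container.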

Challenge 3: Optimize NoSQL design 
In this challenge, you will implement optimizations and demonstrate an improvement in query performance and/or cost per query.
Learning objectives: 
  • Estimate the cost per query for reads and writes, as well as query performance (a measurement sketch follows this list)
  • Use best practices for optimizing the database after evaluating common queries and workloads, then show a measurable improvement after optimization is complete
  • Note that attendees may need to migrate their data once again
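In Cosmos DB, one candidate database for this Hack, the cost per query is expressed in request units (RUs) and is reported on every response. A brief sketch of measuring it with the .NET SDK, using an illustrative container and query:

```csharp
using System;
using Microsoft.Azure.Cosmos;

Container container = new CosmosClient("<endpoint>", "<key>")
    .GetContainer("RetailDb", "Orders");

var query = new QueryDefinition(
        "SELECT * FROM o WHERE o.customerId = @id")
    .WithParameter("@id", "c-42");

double totalCharge = 0;
using FeedIterator<dynamic> iterator = container.GetItemQueryIterator<dynamic>(query);
while (iterator.HasMoreResults)
{
    FeedResponse<dynamic> page = await iterator.ReadNextAsync();
    totalCharge += page.RequestCharge; // RUs consumed by this page
}
Console.WriteLine($"Query cost: {totalCharge} RU");
```

Capturing this figure before and after repartitioning, indexing changes, or denormalization makes the improvement measurable.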

Challenge 4: Events are the center of our universe 
In this challenge, you will add new features to the solution to support the event sourcing pattern and create a report dashboard.  
Learning objectives: 
  • Create a caching layer for a subset of the data
  • Use the event sourcing pattern on order data that flows into the database (see the sketch below)
  • Use these events to create materialized views, real-time dashboards, and cache invalidation
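A minimal sketch of the pattern built on the Cosmos DB change feed processor, assuming a hypothetical lease container and processor name; an Azure Functions Cosmos DB trigger (Azure Functions is also listed for this Hack) is an equally valid approach:

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

CosmosClient client = new CosmosClient("<endpoint>", "<key>");
Container orders = client.GetContainer("RetailDb", "Orders");
Container leases = client.GetContainer("RetailDb", "Leases"); // tracks feed progress

ChangeFeedProcessor processor = orders
    .GetChangeFeedProcessorBuilder<dynamic>(
        processorName: "orderEvents",
        onChangesDelegate: (IReadOnlyCollection<dynamic> changes, CancellationToken ct) =>
        {
            foreach (var change in changes)
            {
                // Each insert or update arrives here in near real-time; fan out
                // to materialized views, a dashboard feed, or cache invalidation.
            }
            return Task.CompletedTask;
        })
    .WithInstanceName("worker-1")
    .WithLeaseContainer(leases)
    .Build();

await processor.StartAsync();
```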

Challenge 5: Improving the search experience 
In this challenge, you will implement full-text search capabilities by creating an index on the title and description fields and adding other filters to help users quickly narrow the results.
Learning objectives: 
  • Enable full-text, fuzzy, and faceted search capabilities on the database (one approach is sketched below)
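One hedged way to meet this objective is an external index in Azure Cognitive Search, which appears in this Hack's service list; the sketch below assumes a hypothetical index named products-index has already been populated from the database (for example, via an indexer):

```csharp
using System;
using Azure;
using Azure.Search.Documents;
using Azure.Search.Documents.Models;

var searchClient = new SearchClient(
    new Uri("https://<service>.search.windows.net"),
    "products-index",
    new AzureKeyCredential("<query-key>"));

var options = new SearchOptions
{
    QueryType = SearchQueryType.Full // full Lucene syntax enables the ~ fuzzy operator
};
options.Facets.Add("category");   // faceted navigation buckets
options.SearchFields.Add("title");
options.SearchFields.Add("description");

// "labtop~" tolerates a misspelling of "laptop" (fuzzy matching).
SearchResults<SearchDocument> results =
    await searchClient.SearchAsync<SearchDocument>("labtop~", options);

await foreach (SearchResult<SearchDocument> hit in results.GetResultsAsync())
{
    Console.WriteLine(hit.Document["title"]);
}
```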

Challenge 6: Taking over the world (MUAHAHAHA) 
In this challenge, you will create a new node or replica of the NoSQL data store within a new Azure region. 
Learning objectives: 
  • Add the NoSQL database to a new region with full replication between both regions (the application-side configuration is sketched below)
  • Help the customer meet data durability and low latency objectives
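Adding the region itself is an account-level operation performed through the Azure portal, CLI, or an ARM template; the hedged sketch below shows the assumed application side with the Cosmos DB .NET SDK, where preferred regions route reads to the nearest replica and double as a failover order:

```csharp
using System.Collections.Generic;
using Microsoft.Azure.Cosmos;

CosmosClient client = new CosmosClient(
    "https://<account>.documents.azure.com:443/",
    "<key>",
    new CosmosClientOptions
    {
        ApplicationPreferredRegions = new List<string>
        {
            Regions.WestEurope, // nearest region first: lowest read latency
            Regions.EastUS      // fallback keeps the app available during failover
        },
        // Session consistency (the default) balances latency and durability;
        // stronger levels can be chosen to match the customer's objectives.
        ConsistencyLevel = ConsistencyLevel.Session
    });
```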

 

3 Days

To be successful and get the most out of this OpenHack, participants should be familiar with database concepts such as data modeling, partitioning, denormalization, and indexing. Prior experience with NoSQL databases and familiarity with relational data structures is helpful but not required. Knowledge of Azure fundamentals is required. Language-specific recommendation:

  • It is recommended that participants have previous experience with programming languages such as C#, JavaScript, Node.js, or Java.

Advanced

  • Cosmos DB
  • Azure SQL Database
  • Azure Data Factory
  • Azure Functions
  • Azure Event Hubs
  • Azure Stream Analytics
  • Power BI
  • Azure Cognitive Search
  • Target audience:
    o Microsoft – CSE, CSA, GBB, ATT, SE, TPM
    o Customer –
  • Target verticals: Cross-Industry
  • Customer profile: Customers looking to migrate from an on-premises database to a cloud-based NoSQL environment.
 
  • Migration to NoSQL – Given an existing web application and SQL database, first perform a raw migration to Cosmos DB or another NoSQL database in Azure, creating a repeatable process with various tools and services
  • NoSQL data modeling and optimization – Evaluating a relational data store, then adapting the schema to a data model optimized for both write-heavy and read-heavy workloads in NoSQL. Optimization includes combining models as necessary within the same collection, denormalizing and embedding, implementing an appropriate indexing strategy for the workloads and query patterns, and selecting an optimal partition strategy
  • Event sourcing – Reacting to data events, such as inserts and updates, enabling scenarios such as populating real-time dashboards, creating materialized views (aggregation), and automatic cache invalidation
  • Advanced visualizations – UI components that show both real-time and batch data with little to no impact on NoSQL database performance
  • Expanding search capabilities – Reaching beyond native indexing and search capabilities provided by the NoSQL database, through an external search service that enables full-text, fuzzy (can handle misspellings), and faceted search against the data catalog
  • Global reach – Adding high availability and low latency by replicating data across geographical regions, bringing data closer to users and deployed services
 

Need to Train a Team?

Contact us to schedule a dedicated class for your team.