Azure Fundamentals: Quick Guide

 

Azure Core Deep Dive

From fundamentals to in‑depth services: Functions, Logic Apps, App Service, Service Bus, Event Grid, Key Vault, Blob Storage, Cosmos DB, Event Hubs, and more.

☁️ Azure Fundamentals

🌍 Regions & Availability Zones

Azure operates in 60+ regions worldwide. Each region contains multiple datacenters; Availability Zones are physically separate locations within a region, providing high availability and resilience.

📦 Resource Groups

A logical container for related Azure resources. Enables lifecycle management, access control, and cost tracking together.

🛡️ Azure Resource Manager

ARM is Azure's deployment and management service. ARM templates (JSON) and Bicep (a DSL that compiles to ARM JSON) enable infrastructure-as-code and consistent, repeatable deployments; access is governed by role-based access control (RBAC).

💰 Pricing & SLA

Pay-as-you-go, reserved instances, and spot pricing. Services carry SLAs (e.g., 99.95% for VMs in an availability set, 99.99% for VMs spread across Availability Zones); when a solution chains several services, the composite SLA is the product of the individual SLAs.
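The effect of composing SLAs can be illustrated with simple arithmetic (the figures below are illustrative, not a quote of any specific Azure SLA):

```python
# Composite SLA: a request must pass through every service in the chain,
# so the individual availabilities multiply (and the result is lower
# than any single service's SLA).
app_service = 0.9995   # 99.95%
sql_db = 0.9999        # 99.99%
composite = app_service * sql_db
print(f"Composite SLA: {composite:.4%}")

# Redundancy works the other way: with two independent instances the
# solution is down only if BOTH are down, so downtimes multiply.
single = 0.9995
redundant = 1 - (1 - single) ** 2
print(f"With redundancy: {redundant:.6%}")
```

This is why adding a dependency lowers a solution's effective SLA, while adding a redundant instance raises it.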

Key concepts: IaaS (Virtual Machines), PaaS (App Service, Functions), SaaS (Office 365). Azure provides hybrid consistency with on‑prem through Azure Arc and Stack.


⚡ Azure Functions

Azure Functions is a serverless compute service that runs code on demand without managing infrastructure. It scales automatically, and you pay only for execution time.

🔹 Triggers & Bindings

  • HTTP trigger – REST API, webhooks
  • Timer trigger – CRON schedules
  • Blob/Queue trigger – Storage events
  • Cosmos DB trigger – change feed
  • Service Bus / Event Hub trigger

🔸 Hosting Plans

  • Consumption – auto-scale, pay-per-execution
  • Premium – pre-warmed instances, VNet integration
  • App Service plan – dedicated VMs

Languages: C#, JavaScript, Python, PowerShell, Java, TypeScript, and custom handlers.

// HTTP-triggered Azure Function (Node.js)
module.exports = async function (context, req) {
    const name = req.query.name || (req.body && req.body.name);
    context.res = {
        status: 200,
        body: `Hello ${name || 'Azure'}!`
    };
};

Durable Functions extension enables stateful workflows (function chaining, fan-out/fan-in) using orchestration contexts.
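Running the real thing requires the Durable Functions extension, but the function-chaining pattern it enables can be sketched in plain Python (the activity names here are invented for illustration):

```python
# Conceptual sketch of the function-chaining pattern, NOT the actual
# durable-functions SDK: each "activity" is a deterministic step and the
# orchestrator threads the output of one step into the next.
def validate_order(order):
    return {**order, "valid": order["amount"] > 0}

def charge_payment(order):
    return {**order, "charged": order["valid"]}

def send_receipt(order):
    return {**order, "receipt_sent": order["charged"]}

def orchestrate(order):
    # In a real orchestrator each call would be `yield context.call_activity(...)`,
    # letting the runtime checkpoint state between steps.
    for activity in (validate_order, charge_payment, send_receipt):
        order = activity(order)
    return order

result = orchestrate({"id": 1, "amount": 42})
```

Fan-out/fan-in is the same idea with the middle steps run in parallel and their results aggregated before the final step.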

🔄 Logic Apps

Logic Apps is a cloud-based workflow platform to automate and orchestrate tasks, business processes, and integrations across services using a visual designer.

It provides 200+ connectors (Office 365, Salesforce, SAP, on‑premises systems via the data gateway) and runs either multi-tenant (Consumption) or single-tenant (Standard); the older Integration Service Environment (ISE) has been retired.

🔹 Triggers

  • Recurrence (schedule)
  • HTTP request / webhook
  • When a blob is added/modified
  • Service Bus queue message
  • Event Grid resource event

🔸 Actions & Control

  • Condition, Switch, For each, Until
  • Call Azure Functions / APIs
  • Parse JSON, compose, variables
  • Send email, Teams notification

Logic Apps use JSON-based workflow definitions. Example snippet (trigger + condition):

{
  "definition": {
    "$schema": "https://schema.management.azure.com/...",
    "triggers": {
      "When_a_HTTP_request_is_received": {
        "type": "Request",
        "kind": "Http"
      }
    },
    "actions": {
      "Condition": {
        "type": "If",
        "expression": "@equals(triggerBody()?['status'], 'approved')",
        "actions": { "Send_an_email": { ... } }
      }
    }
  }
}

🌐 App Service

App Service is a fully managed platform for hosting web apps, REST APIs, and mobile backends. It supports .NET, Java, Node.js, Python, PHP, and containers.

Key features: auto-scaling, deployment slots (staging), integrated load balancing, authentication/authorization (Easy Auth), custom domains, and TLS/SSL.

  • Web Apps – modern web applications
  • API Apps – RESTful APIs with Swagger support
  • WebJobs – background processing within the same App Service plan
  • Linux & Windows containers support

App Service Environment (ASE) provides a fully isolated and dedicated environment for running apps securely at high scale.

# Example: deploy a Node.js app using Azure CLI
az webapp up --name my-unique-app --runtime "NODE:18-lts" --os-type Linux

📨 Service Bus

Service Bus is a fully managed enterprise message broker with queues and publish-subscribe topics. It supports ordered messaging, sessions, dead-lettering, and duplicate detection.

🔹 Queues

Point-to-point communication. Messages are pulled by a single consumer. Supports First-In-First-Out (FIFO) with sessions.
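How sessions give FIFO per logical stream can be sketched in plain Python (a toy model, not the Service Bus SDK):

```python
from collections import defaultdict, deque

# Toy model of session-based FIFO: messages sharing a session id form one
# ordered stream, and a single consumer locks that session and drains it
# in order. (Real Service Bus adds locks, leases, and dead-lettering.)
class SessionQueue:
    def __init__(self):
        self._sessions = defaultdict(deque)

    def send(self, session_id, body):
        self._sessions[session_id].append(body)

    def receive(self, session_id):
        q = self._sessions[session_id]
        return q.popleft() if q else None

q = SessionQueue()
q.send("order-42", "created")
q.send("order-42", "paid")
first = q.receive("order-42")   # FIFO within the session: "created" first
```

Messages in different sessions can be processed concurrently; ordering is only guaranteed within a session.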

🔸 Topics & Subscriptions

Pub/sub model: multiple subscribers receive a copy of each message based on filter rules (SQL‑like or correlation filters).

Advanced features: scheduled delivery, message deferral, auto-forwarding, and geo‑disaster recovery.
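The fan-out behavior of topics with correlation-style filters can be approximated like this (a plain-Python sketch with invented subscription names, not the Service Bus SDK):

```python
# Sketch of topic fan-out: each subscription receives a copy of every
# message whose application properties match its filter. An empty filter
# matches everything, like a subscription with no rule restrictions.
def matches(message_props, filter_props):
    return all(message_props.get(k) == v for k, v in filter_props.items())

subscriptions = {
    "eu-orders": {"region": "eu"},
    "us-orders": {"region": "us"},
    "all-orders": {},          # catch-all subscription
}

def publish(message_props, subscriptions):
    return [name for name, flt in subscriptions.items()
            if matches(message_props, flt)]

delivered = publish({"region": "eu", "type": "order"}, subscriptions)
```

Real SQL filters additionally support comparison operators and expressions over message properties; correlation filters are the cheaper equality-only form shown here.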

// Send a message to a Service Bus queue (C#)
await using var client = new ServiceBusClient(connectionString);
ServiceBusSender sender = client.CreateSender(queueName);
await sender.SendMessageAsync(new ServiceBusMessage("Order processed"));

📡 Event Grid

Event Grid is a serverless event broker that enables reactive programming using a publish-subscribe model. It routes events from any source to any destination with native integration for Azure services and custom topics.

Sources (publishers): Blob Storage, Resource Groups, IoT Hub, Service Bus, custom applications via HTTP.

Handlers (subscribers): Azure Functions, Logic Apps, WebHooks, Event Hubs, Service Bus queues.

Supports event filtering by event type, subject prefix/suffix, and advanced filters. Dead-lettering for undelivered events.
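The subscription-side filtering logic can be sketched in plain Python (a conceptual model of the matching rules, not the Event Grid service itself):

```python
# Sketch of Event Grid-style filtering: a subscription can restrict by
# event type and by subject prefix/suffix; only matching events are
# delivered to its handler.
def event_matches(event, event_types=None,
                  subject_begins_with="", subject_ends_with=""):
    if event_types and event["eventType"] not in event_types:
        return False
    return (event["subject"].startswith(subject_begins_with)
            and event["subject"].endswith(subject_ends_with))

event = {
    "subject": "/blobServices/default/containers/images/blobs/photo.png",
    "eventType": "Microsoft.Storage.BlobCreated",
}

hit = event_matches(
    event,
    event_types={"Microsoft.Storage.BlobCreated"},
    subject_begins_with="/blobServices/default/containers/images/",
    subject_ends_with=".png",
)
```

Advanced filters extend this to operators (numeric comparisons, `in`, string contains) over fields in the event's `data` payload.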

{
  "topic": "/subscriptions/.../resourceGroups/...",
  "subject": "/blobServices/default/containers/images/blobs/photo.png",
  "eventType": "Microsoft.Storage.BlobCreated",
  "data": { "api": "PutBlockList", "contentType": "image/png" },
  "dataVersion": "1.0",
  "metadataVersion": "1"
}

🔐 Key Vault

Key Vault safeguards cryptographic keys, secrets, and certificates. It uses HSMs (Hardware Security Modules) for key protection and provides centralized management.

  • Secrets – connection strings, API keys, passwords
  • Keys – encryption keys (RSA, EC) for BYOK scenarios
  • Certificates – TLS/SSL certs with auto‑renewal
  • Access policies & RBAC for fine-grained permissions
  • Soft-delete and purge protection

Apps authenticate using managed identities or service principals to retrieve secrets without storing credentials in code.

# Retrieve a secret via Azure CLI
az keyvault secret show --vault-name mykv --name dbpassword --query value -o tsv

🗃️ Blob Storage

Blob Storage is massively scalable object storage for unstructured data: documents, images, backups, logs, and data lakes.

Blob types: Block blobs (text/binary), Append blobs (logging), Page blobs (VHD disks).

Access tiers: Hot (frequent), Cool (infrequent, 30 days), Cold (90 days), Archive (180+ days). Lifecycle management policies automate tiering.
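A lifecycle management policy is expressed as JSON rules attached to the storage account. A sketch (rule name and prefix are illustrative; day thresholds are examples):

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "age-out-logs",
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": ["blockBlob"], "prefixMatch": ["logs/"] },
        "actions": {
          "baseBlob": {
            "tierToCool":    { "daysAfterModificationGreaterThan": 30 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 180 },
            "delete":        { "daysAfterModificationGreaterThan": 365 }
          }
        }
      }
    }
  ]
}
```

The policy engine evaluates rules roughly once a day, so tier transitions are not instantaneous.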

Features: static website hosting, Azure CDN integration, versioning, immutable storage (WORM), and Azure Data Lake Storage Gen2 for big data analytics.

// Upload a blob using the .NET SDK (container is a BlobContainerClient)
BlobClient blobClient = container.GetBlobClient("documents/report.pdf");
await blobClient.UploadAsync(filePath, overwrite: true);

🌌 Cosmos DB

Cosmos DB is a fully managed, globally distributed, multi-model database. It offers turnkey global replication, single-digit millisecond latency, and elastic scalability.

APIs / models: SQL (Core), MongoDB, Cassandra, Gremlin (graph), Table (key-value), and PostgreSQL.

Consistency levels: Strong, Bounded staleness, Session, Consistent prefix, Eventual – choose the right balance between performance and consistency.

Automatically indexes every property; supports change feed for event-driven architectures. Serverless and provisioned throughput options.

// Query using SQL API (JavaScript SDK)
const { resources } = await container.items
    .query("SELECT * FROM c WHERE c.category = 'electronics'")
    .fetchAll();

📊 Event Hubs

Event Hubs is a hyper-scale telemetry ingestion service that can process millions of events per second. It's the front door for big data pipelines and real-time analytics.

Partitions: events are distributed across partitions; consumers read from partition offsets. Throughput units (or premium processing units) manage capacity.
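Why partition keys matter can be sketched with a stable hash over a fixed partition count (plain Python; Event Hubs' internal hash function differs, but the property is the same):

```python
import hashlib

# Sketch: map a partition key to one of N partitions with a stable hash,
# so all events for the same key (e.g. one device) land on the same
# partition and keep their relative order.
def assign_partition(partition_key: str, partition_count: int) -> int:
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % partition_count

p1 = assign_partition("device1", 4)
p2 = assign_partition("device1", 4)   # same key -> same partition, always
```

This is also why the partition count is fixed at creation time in the standard tier: rehashing keys across a changed count would break per-key ordering.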

Capture feature automatically stores streaming data in Blob Storage or Data Lake.

Common use: IoT telemetry, clickstream analytics, log aggregation, and integration with Azure Stream Analytics, Spark, or Functions.

# Send events using the Event Hubs producer client (Python)
import asyncio
from azure.eventhub.aio import EventHubProducerClient
from azure.eventhub import EventData

async def run():
    # conn_str holds the Event Hubs namespace connection string
    producer = EventHubProducerClient.from_connection_string(conn_str, eventhub_name="myhub")
    async with producer:
        event_data = EventData('{"temperature": 25.6}')
        await producer.send_batch([event_data], partition_key="device1")

asyncio.run(run())
