January 9, 2026

GraphQL in Action: Building an API with .NET Core

Introduction: What is GraphQL?

GraphQL is:

  • A query language used to request data
  • A runtime that executes those queries
  • An API style that typically exposes a single endpoint

Instead of returning fixed responses the way REST does, the API provides flexible data fetching, allowing the client to request exactly what it needs. This makes the API more frontend-friendly and adaptable to changing UI requirements.

Why GraphQL?

Modern applications such as React apps, Angular apps, and mobile apps need APIs that are flexible and fast.
Frontend developers usually expect:
  • Only the data they need
  • Fewer API calls
  • Faster UI development
But with traditional REST APIs, we often face problems like:
  • The API sends extra data that the UI never uses
  • Multiple API calls are needed for one screen
  • UI changes force backend API changes
Because of these issues, GraphQL becomes a good solution.
It allows the client (the frontend) to decide what data it wants, instead of the backend forcing a fixed response.

REST vs GraphQL

REST API

 GET /api/products  
This returns many fields even if the UI needs only the product name and price.

GraphQL

 query {
  products {
   name
   price
  }
 }
This returns only name and price, nothing extra.
This is the biggest advantage of GraphQL.

Why GraphQL is Better Than REST (Practical View)

  • UI gets only required data
  • One endpoint works for many clients
  • No unnecessary payload
  • The backend does not need to change when the UI changes
This makes frontend and backend more independent.

When Should You Use GraphQL?

GraphQL is good when:

  • Frontend has heavy UI logic
  • Same backend is used by web and mobile apps
  • UI keeps changing frequently
  • You want to reduce network calls

GraphQL is not ideal when:

  • The application is very simple CRUD
  • API mainly handles file uploads/downloads
  • Strong HTTP caching is a top requirement

Setting Up GraphQL in .NET Core (Working Example)

 Create a New Project

 dotnet new webapi -n GraphQLDemo  
 cd GraphQLDemo  
This creates a basic ASP.NET Core Web API project.

Install Required Packages

 dotnet add package GraphQL  
 dotnet add package GraphQL.Server.Transports.AspNetCore  
 dotnet add package GraphQL.Server.Ui.GraphiQL  

These packages help us:

  • Define the GraphQL schema
  • Execute GraphQL queries over HTTP
  • Use the GraphiQL UI to test queries

Product Model
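A minimal sketch of the model (the file location and property defaults are assumptions):

 // Models/Product.cs – simple POCO used as the data source
 public class Product
 {
     public int Id { get; set; }
     public string Name { get; set; } = string.Empty;
     public decimal Price { get; set; }
 }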

What this code does

  • Creates a simple C# class
  • Represents a product entity

Why this is required

  • GraphQL works with strongly typed objects
  • This model acts as the data source
  • In real projects, this usually maps to a database table

So this model is the base of our GraphQL response.


ProductType (GraphQL Object Type)
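A minimal sketch of such a type using GraphQL.NET's ObjectGraphType<T> (the field set is taken from the query examples in this post):

 // GraphQL/ProductType.cs – maps the Product class to a GraphQL object type
 using GraphQL.Types;

 public class ProductType : ObjectGraphType<Product>
 {
     public ProductType()
     {
         Field(p => p.Id);     // exposed as "id"
         Field(p => p.Name);   // exposed as "name"
         Field(p => p.Price);  // exposed as "price"
     }
 }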

What this code does

  • Converts the C# Product class into a GraphQL type
  • Exposes fields like id, name, and price

Why this is required

  • GraphQL does not directly expose C# models
  • You must clearly define which fields are allowed

Security benefit

Only fields defined here can be queried, so sensitive data is automatically protected.

ProductQuery (Query Resolver)
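A minimal sketch using the GraphQL.NET field-builder API; the product list is hard-coded to match the sample response later in this post, where a real project would call a service layer or EF Core instead:

 // GraphQL/ProductQuery.cs – root query that defines the "products" field
 using System.Collections.Generic;
 using GraphQL.Types;

 public class ProductQuery : ObjectGraphType
 {
     public ProductQuery()
     {
         Field<ListGraphType<ProductType>>("products")
             .Resolve(context => new List<Product>
             {
                 new Product { Id = 1, Name = "Apple", Price = 120 },
                 new Product { Id = 2, Name = "Banana", Price = 60 }
             });
     }
 }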

What this code does

  • Creates a GraphQL query named products
  • Defines how product data is fetched
  • Executes resolver logic when query runs

Why this is required

  • GraphQL needs resolver logic to get data
  • This is similar to a controller method in Web API
In real projects, the resolver usually calls:
  • Service layer
  • EF Core
  • External APIs
The resolver acts as a bridge between the query and the data.

AppSchema (GraphQL Schema)
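A minimal sketch; passing IServiceProvider into the base Schema constructor and resolving the root query from it is the usual GraphQL.NET pattern:

 // GraphQL/AppSchema.cs – wires the root query into the schema
 using GraphQL.Types;
 using Microsoft.Extensions.DependencyInjection;

 public class AppSchema : Schema
 {
     public AppSchema(IServiceProvider provider) : base(provider)
     {
         Query = provider.GetRequiredService<ProductQuery>();
     }
 }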

What this code does

  • Registers all available queries
  • Acts as the main entry point for GraphQL execution

Why this is required

  • GraphQL cannot work without a schema
  • Schema defines:
    • Available queries
    • Available mutations (later)
The schema is the contract between the frontend and the backend.

Registering GraphQL Services (Dependency Injection)
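A minimal sketch of the registrations in Program.cs (singleton lifetimes are an assumption; scoped registrations also work):

 // Program.cs – register GraphQL components with the DI container
 using GraphQL.Types;

 var builder = WebApplication.CreateBuilder(args);

 builder.Services.AddSingleton<ProductType>();
 builder.Services.AddSingleton<ProductQuery>();
 builder.Services.AddSingleton<ISchema, AppSchema>();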

What this code does

  • Registers GraphQL components in ASP.NET Core DI
  • Allows GraphQL to resolve dependencies properly

Why this is required

  • GraphQL.NET depends on dependency injection.
  • Without this:
    • Schema won’t load
    • Queries will fail
This follows standard ASP.NET Core dependency injection practice.

Add GraphQL Configuration 
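Continuing in Program.cs, a minimal sketch using the GraphQL.NET 7-style builder; the exact extension methods vary slightly between versions, and the System.Text.Json serializer may also require the GraphQL.SystemTextJson package:

 // Program.cs – enable GraphQL execution and System.Text.Json serialization
 builder.Services.AddGraphQL(b => b
     .AddSystemTextJson());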


What this code does

  • Enables GraphQL in ASP.NET Core
  • Configures JSON serialization

Why this is required

  • GraphQL responses are returned in JSON format
  • Uses System.Text.Json for better performance
  • Registers GraphQL middleware internally
Without this setup, the GraphQL endpoint won't work.

GraphQL Middleware Configuration
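A minimal sketch of the middleware wiring in Program.cs, assuming the default paths /graphql and /ui/graphiql:

 // Program.cs – map the GraphQL endpoint and the GraphiQL UI
 var app = builder.Build();

 app.UseGraphQL<ISchema>();        // exposes the /graphql endpoint

 if (app.Environment.IsDevelopment())
 {
     app.UseGraphQLGraphiQL();     // exposes /ui/graphiql for testing queries
 }

 app.Run();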

What this code does

  • UseGraphQL<ISchema>() exposes /graphql endpoint
  • UseGraphQLGraphiQL() provides UI to test queries

Why this is required

  • GraphQL works over HTTP
  • Middleware connects HTTP request to GraphQL engine
GraphiQL helps developers:
  • Test queries
  • Explore schema
  • Debug responses
GraphiQL should be disabled in production.

GraphQL Query Example

 query {  
  products {  
   id  
   name  
   price  
  }  
 }  

What happens here

  • Client requests only required fields
  • products resolver is executed

Why this is powerful

  • No over-fetching
  • One backend supports multiple UIs
  • Client controls response format
This is the core strength of GraphQL.

Testing Using GraphiQL

Open:

 https://localhost:{port}/ui/graphiql  

Run the query:

 query {  
  products {  
   id  
   name  
   price  
  }  
 }  

Response

 {  
  "data": {  
   "products": [  
    { "id": 1, "name": "Apple", "price": 120 },  
    { "id": 2, "name": "Banana", "price": 60 }  
   ]  
  }  
 }  

Security Considerations

GraphQL is not secure by default. You must implement:
  • Authentication (JWT / OAuth)
  • Query depth limit
  • Disable schema introspection in production
  • Rate limiting
Security depends on how you implement GraphQL, not GraphQL itself.

Real-World Architecture

Frontend (React / Mobile)
→ GraphQL Query
→ Resolver
→ Service Layer
→ Database
GraphQL acts as a smart data layer between UI and backend.

Conclusion

GraphQL is a strong API solution when:
  • UI changes frequently
  • Multiple clients use the same backend
  • Performance and flexibility are important
But for simple CRUD applications, REST APIs are still a very good and simple choice.


PostgreSQL Major Version Upgrades on Azure: A Terraform-based Approach

Introduction

PostgreSQL 11 has reached its end of life, and Azure recommends upgrading to PostgreSQL 13 or later for enhanced security, improved performance, and long-term support. Unlike minor upgrades, Azure Database for PostgreSQL (Flexible Server) does not support in-place major version upgrades. This makes the upgrade process slightly non-trivial—especially when the server is provisioned using Terraform, and some environments use VNet integration.

In this blog, we’ll walk through:
  • How Azure PostgreSQL upgrades work
  • Why does Terraform recreate the server
  • Multiple migration strategies
  • The exact steps I followed to upgrade PostgreSQL 11 → 13 safely


Existing Setup

My environment had the following characteristics:

  • Azure Database for PostgreSQL – Flexible Server
  • PostgreSQL version: 11
  • SKU: Burstable B1ms (1 vCore, 2 GiB RAM)
  • Storage: 32 GiB
  • Region: Central US
  • Provisioned using Terraform
  • Mixed environments: Some with public access, some with VNet integration
  • Firewall rules restricted to specific IPs

Terraform snippet (simplified):
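(Resource names, the resource group reference, and password handling below are illustrative.)

resource "azurerm_postgresql_flexible_server" "pg" {
  name                   = "myserver"
  resource_group_name    = azurerm_resource_group.main.name
  location               = "centralus"
  version                = "11"
  administrator_login    = "pgsqladmin"
  administrator_password = var.pg_admin_password
  sku_name               = "B_Standard_B1ms"
  storage_mb             = 32768
  backup_retention_days  = 7
}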



Important Reality: No In-Place Major Version Upgrade

This is the most critical thing to understand: Azure PostgreSQL Flexible Server does NOT support in-place major version upgrades.

That means:
  • You cannot upgrade PostgreSQL 11 → 13 on the same server
  • Changing version = "13" in Terraform:
    • Deletes the existing PostgreSQL 11 server
    • Creates a brand-new PostgreSQL 13 server
  • All data is lost unless you migrate or restore it manually

 

Terraform makes this very clear in the plan output: forces replacement. This is not really an upgrade; it is a rebuild plus a migration.

Why This Upgrade Looks Simple — and Why It Isn’t

At first glance, the upgrade appears trivial: version = "13" 

But behind this single line:
  • Azure treats PostgreSQL major versions as immutable
  • Terraform maps this to a ForceNew operation
  • Automated backups are tied to the old server lifecycle
  • Configuration and data do not carry over


What Actually Happens (Timeline)

Understanding the timeline helps avoid surprises:

T-0: PostgreSQL 11 running

  • Applications connected
  • Data live
  • Automated backups available


T-1: Terraform version updated

  • version = "11" → version = "13"
  • Plan shows forces replacement


T-2: Terraform apply

  • PostgreSQL 11 server is deleted
  • Databases and backups disappear


T-3: PostgreSQL 13 server created

  • Empty server
  • Default parameters
  • No firewall rules
  • No databases


T-4: Manual restore

  • Data restored
  • Configuration reapplied
  • Applications reconnect


Available Upgrade Approaches

1. Azure Database Migration Service (DMS)
2. Backup & Restore (pg_dump / pgAdmin)
3. Temporary Public Access

Here we focus on Option 3, which was simple, cost-effective, and acceptable for my downtime window.

Step 1: Take a Backup

I used pgAdmin 4 with a custom format backup.

Why Custom format?
  • Includes schema + data
  • Best compatibility across versions
  • Works cleanly with pg_restore

$env:PGSSLMODE = "require"   # require SSL; pg_dump reads PGSSLMODE from the environment
pg_dump `
  -h myserver.postgres.database.azure.com `
  -U pgsqladmin@myserver `
  -d master_data_service `
  -Fc `
  -f master_data_service_v11.dump

Step 2: Upgrade PostgreSQL Version via Terraform

In Terraform, change the version: version = "13"

Run:

terraform plan
terraform apply

Important: applying this immediately destroys the PostgreSQL 11 server and creates a new, empty PostgreSQL 13 server with the same name.

Step 3: Restore the Database to PostgreSQL 13

$env:PGSSLMODE = "require"   # require SSL; pg_restore reads PGSSLMODE from the environment
pg_restore `
  -h myserver.postgres.database.azure.com `
  -U pgsqladmin@myserver `
  -d postgres `
  --create `
  -Fc `
  master_data_service_v11.dump

This:
  • Recreated the database
  • Restored schema and data
  • Worked cleanly from v11 → v13


Step 4: Server Parameters & Configuration

Azure applies default server parameters when a new PostgreSQL server is created.

Key learning:

  • Server parameters are NOT automatically migrated
  • If you changed parameters manually in the portal, you must reapply them


Step 5: VNet-Integrated Environments

For servers with VNet integration:
  • No public endpoint exists
  • Local pgAdmin / pg_dump won’t connect

Available options:
  • Use Azure DMS inside the VNet
  • Use a VM or jumpbox
  • Temporarily enable public access

We temporarily enabled public access with strict /32 firewall rules and disabled it immediately after migration.

Step 6: Validate & Cutover

After restoring:
  • Verified tables, row counts, and extensions
  • Tested application connectivity
  • Updated connection strings where required
  • Disabled public access again for private environments

Cost Considerations
  • PostgreSQL B1ms server: ~$25/month
  • Temporary overlap or migration time: a few dollars
  • Azure DMS (Standard): Often free for migration scenarios
  • Overall upgrade cost: minimal


Key Takeaways

  • Azure PostgreSQL major upgrades are not in-place
  • Terraform recreates the server when the version changes
  • Always back up before upgrading
  • Server parameters must be reapplied
  • For VNet setups, plan connectivity carefully
  • PostgreSQL supports a direct dump-and-restore jump from 11 → 13


Final Thoughts

Upgrading PostgreSQL on Azure requires careful planning, but with the right approach, it can be a predictable and safe process.

If you’re using Terraform:

  • Treat major version upgrades as rebuild + restore
  • Automate as much as possible
  • Test in lower environments first