The End of Front-End

Delivering Software via LLMs and the Model Context Protocol

pip install rse-vibe-coding

👋 Cristian Dinu, UCL


The RSE Friction

RSE as Tool Builders…

  • Endless UIs
  • Endless docs
  • Endless support

RSE as Tool Users…

  • Endless wrappers
  • Endless glue code
  • Endless boilerplate

What if your software just… understood you?

The Chasm

An illustration of a cracked chasm running vertically through the middle. On the left side, a blue electric car icon with the text 'What is the cleanest time to charge my car?' On the right side, a code window icon with the text '/api/v1/forecast?region=04'. The chasm visually separates the plain-language question from the technical API call.

There is something that can:

  • understand natural language
  • understand intent
  • generate a list of commands… if only it had the tools

A cartoon-style drawing of a glass jar with a smiling brain inside. The jar is labeled 'LLM' in block letters.

A block diagram showing a chat application containing an LLM (GPT-4o, DeepSearch, Ollama).

A block diagram of a chat application with an LLM using custom tools for web search and code sandbox, connected to a cloud.

A block diagram showing an MCP host with an LLM and MCP client connected to multiple MCP servers for carbon intensity, Vaillant Cloud, and a local smart home.

A clean diagram showing an LLM icon on the left, connected via MCP (Model Context Protocol) in the center, to multiple services on the right (Slack, Google Drive, GitHub). Each service has the same flow: Discover, Parse, Respond. Leonis Capital logo is in the corner.

MCP is a Contract

A simple schema defines your tool’s capabilities.

{
  "name": "fetch",
  "description": "Fetches a URL from the internet...",
  "inputSchema": {
    "url": {
      "description": "URL to fetch",
      "format": "uri",
      "type": "string"
    },
    "max_length": {
      "default": 5000,
      "description": "Maximum number of characters to return.",
      "type": "integer"
    },
    ...
  }
}
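
You rarely write this JSON by hand: frameworks such as FastMCP (used below) generate it automatically from a tool function’s signature, type hints, and docstring.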

Live Demo: The Setup

  • Client: Any LLM chat interface (VSCode, Claude, Cursor, etc.)
  • Server: A Python app exposing our tools via MCP.
An infographic showing a cracked chasm between a car charging question on the left and an API code icon on the right, bridged by a curved arch labeled “MCP.”

Creating our own MCP Server

  • Carbon Intensity API – api.carbonintensity.org.uk (sample request below)
  • Prerequisites:
    • uv – Python package and project manager
    • Node.js – JavaScript runtime (optional, for the MCP Inspector tool)
  • We need to think about semantic intent
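
Before wrapping the API, it helps to see what it returns. A minimal sketch (the printed response is abridged and the values are illustrative):

import requests

# Peek at the raw Carbon Intensity API we are about to expose as an MCP tool.
response = requests.get("https://api.carbonintensity.org.uk/intensity", timeout=10)
response.raise_for_status()
print(response.json())
# e.g. {"data": [{"from": "...", "to": "...",
#                 "intensity": {"forecast": 151, "actual": 148, "index": "moderate"}}]}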

Initialize the project

uv init intensity_mcp
cd intensity_mcp

uv add fastmcp requests

(code at: github.com/cdinu/uk-carbon-intensity-mcp-py)
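
After these commands, pyproject.toml looks roughly like this (a sketch: uv records exact version constraints and a Python requirement that depend on your setup):

[project]
name = "intensity-mcp"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "fastmcp",
    "requests",
]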

Write the code – main.py

Step 1: a plain Python function that wraps the API call.

import requests

def intensity_current():
    """Fetches the current carbon intensity of the UK electricity grid.
    Returns a JSON with the current carbon intensity in gCO2eq/kWh."""
    response = requests.get("https://api.carbonintensity.org.uk/intensity")
    response.raise_for_status()
    data = response.json()
    return data["data"]

Step 2: import FastMCP.

import requests
from fastmcp import FastMCP

def intensity_current():
    """Fetches the current carbon intensity of the UK electricity grid.
    Returns a JSON with the current carbon intensity in gCO2eq/kWh."""
    response = requests.get("https://api.carbonintensity.org.uk/intensity")
    response.raise_for_status()
    data = response.json()
    return data["data"]

Step 3: create a named server instance.

import requests
from fastmcp import FastMCP

mcp = FastMCP("CarbonIntensityMCP")

def intensity_current():
    """Fetches the current carbon intensity of the UK electricity grid.
    Returns a JSON with the current carbon intensity in gCO2eq/kWh."""
    response = requests.get("https://api.carbonintensity.org.uk/intensity")
    response.raise_for_status()
    data = response.json()
    return data["data"]

Step 4: register the function as an MCP tool and run the server. The docstring becomes the tool description the LLM reads.

import requests
from fastmcp import FastMCP

mcp = FastMCP("CarbonIntensityMCP")

@mcp.tool(name="Get Current Intensity")
def intensity_current():
    """Fetches the current carbon intensity of the UK electricity grid.
    Returns a JSON with the current carbon intensity in gCO2eq/kWh."""
    response = requests.get("https://api.carbonintensity.org.uk/intensity")
    response.raise_for_status()
    data = response.json()
    return data["data"]

if __name__ == "__main__":
    mcp.run()
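
Run the server with uv; if Node.js is installed, the MCP Inspector gives you a debugging UI (the npx invocation below is typical, but flags may vary between versions):

uv run main.py
npx @modelcontextprotocol/inspector uv run main.py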

Add parameters, too:

@mcp.tool(name="Read Carbon Intensity for Dates and Postcode")
def intensity_for_dates_and_postcode(
    from_datetime: str,
    to_datetime: str,
    postcode: str,
):
    """Fetches electricity grid carbon intensity data for a specific UK postcode and time range.
    The `from_datetime` and `to_datetime` should be in ISO 8601 format (e.g. 2018-05-15T12:00Z).
    The `postcode` needs only the first part e.g. RG10 (without the last three characters or space)
    Returns a summary including average forecast and generation mix. Dates returned are UTC. Units are gCO2eq/kWh.
    """
    url = f"https://api.carbonintensity.org.uk/regional/intensity/{from_datetime}/{to_datetime}/postcode/{postcode}"
    try:
        response = requests.get(url)
        response.raise_for_status()
        return response.json()["data"]

    except Exception as e:
        return f"Failed to retrieve carbon intensity data: {str(e)}"

Configure your MCP client

A screen recording showing how to add a custom MCP server in the VSCode LLM chat interface.
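
For VSCode, the configuration lives in .vscode/mcp.json and is shaped roughly like this (a sketch: key names and file locations differ between clients and versions, and the directory path is a placeholder):

{
  "servers": {
    "carbon-intensity": {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "--directory", "/path/to/intensity_mcp", "main.py"]
    }
  }
}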

View available tools

A screen recording showing how to view available tools in the VSCode LLM chat interface.

Call a tool

A screen recording showing how to call the 'Get Current Intensity' tool in the VSCode LLM chat interface.

Call another tool

A screen recording showing how the LLM creates a CSV file.

The Real Magic: Orchestration

A screen recording showing how the LLM finds a 3-hour window to charge an EV and books it in the calendar.

What Just Happened?

  1. We provided capabilities
  2. User stated intent
  3. The LLM built the workflow on the fly (loop sketched below)
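
Under the hood, the host runs a loop along these lines (illustrative pseudologic; the names are invented for this sketch and match no specific client or LLM API):

def run_conversation(llm, mcp_client, user_message):
    messages = [user_message]
    tools = mcp_client.list_tools()        # the capabilities we provided
    while True:
        reply = llm.generate(messages, tools=tools)
        if reply.tool_call is None:        # no tool needed: final answer
            return reply.text
        # The LLM picks the tool and its arguments itself, from the schemas alone
        result = mcp_client.call_tool(reply.tool_call.name, reply.tool_call.arguments)
        messages.append(result)            # feed the output back for the next step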

RSE Superpowers Unlocked 🚀

For Tool Builders…

  ✅ Escape the UI Treadmill
  ✅ Slash Support Load
  ✅ Instant “Feature” Rollout

For Tool Users…

  ✅ Query by Intent
  ✅ Zero Boilerplate
  ✅ Your Personal Orchestrator

The Future is a Conversation

  • Natural language is the universal, flexible front-end “programming language”.
  • APIs are the stable, reliable backend.
  • MCP is the contract that connects them.

What will you build?

  • The front-end isn’t disappearing.
  • It’s becoming invisible, adaptive, and conversational.

Thank You!

Cristian Dinu

https://cdi.nu

https://cdi.nu/talks/2025-09-RSECon