AI with Hono

Me

Workers Tech Talks

Workers Tech Talks

Hono Conference

Hono Conference

The current state of Hono

(Screenshot)

(Screenshot)

Links

Examples of where it's used

My products

Inside Cloudflare

AI with Hono

Agenda

  1. The Hono CLI's approach to AI
  2. Hono powering MCP
  3. Building AI apps with Hono
  4. Do we need frameworks in the AI era?

1. The Hono CLI's approach to AI

Hono CLI

A completely new concept

The hono command

hono --help

Five subcommands

# Show help
hono --help
 
# Display documentation
hono docs
 
# Search documentation
hono search middleware
 
# Send request to Hono app
hono request
 
# Start server
hono serve
 
# Generate an optimized Hono app
hono optimize

hono docs

hono docs [path]
hono docs /docs/api/context
hono search <query>
hono search middleware

When AI uses it…

(Screenshot)

hono request

hono request [file]
hono request src/index.ts

Several options

hono request \
  -P /api/users \
  -X POST \
  -d '{"name":"Alice"}' \
  src/index.ts

Handy for testing your app

Starting a server

wrangler dev
curl http://localhost:8787

No need to start a server

hono request

In other words… when AI uses it:

  1. hono search - search the documentation
  2. hono docs - read the documentation
  3. hono request - test the app

It's a good idea to put this in CLAUDE.md or AGENTS.md

### Workflow

1. Search documentation: `hono search <query>`
2. Read relevant docs: `hono docs [path]`
3. Test implementation: `hono request [file]`

hono serve

hono serve src/index.ts

A server starts at http://localhost:7070

The --use option

hono serve \
  --use "logger()" \
  src/index.ts

A more complex example

hono serve \
  --use "logger" \
  --use "basicAuth({username:'foo',password:'bar'})" \
  src/index.ts

File server

hono serve \
  --use "serveStatic({root:'./'})"

Proxy

hono serve \
  --use '(c) => proxy(`https://ramen-api.dev${new URL(c.req.url).pathname}`)'

hono optimize

hono optimize [entry]

Optimization

import { Hono } from 'hono'
 
const app = new Hono()
 
app.get('/', async (c) => {
  return c.json({ message: 'Hello' })
})
 
app.get('/health', (c) => c.text('OK'))
 
export default app

Result

(Screenshot)

Recap

Five subcommands

# Show help
hono --help
 
# Display documentation
hono docs
 
# Search documentation
hono search
 
# Send request to Hono app
hono request
 
# Start server
hono serve
 
# Generate an optimized Hono app
hono optimize

Workers Fetch

# GET request (auto-detects wrangler.json, wrangler.jsonc, or wrangler.toml)
workers-fetch
 
# With path
workers-fetch /api/users
 
# POST request
workers-fetch -X POST -H "Content-Type:application/json" -d '{"name":"test"}' /api/users
 
# Custom config file
workers-fetch -c wrangler.toml /api/test
 
# With timeout (5 seconds)
workers-fetch --timeout 5 /api/slow

wrangler run?

(Screenshot)

https://x.com/yusukebe/status/2011974518026485767/quotes

VS Code extension

(Screenshot)

Skills

Cloudflare's Skills:

2. Hono powering MCP

Hono and MCP

AI-related libraries are using Hono

Hono has started to be used in the MCP SDK

The MCP SDK's transition

(Screenshot)

https://github.com/modelcontextprotocol/typescript-sdk/issues/260

Basic syntax

const transport = new WebStandardStreamableHTTPServerTransport()
 
app.all('/mcp', async (c) => {
  return transport.handleRequest(c.req.raw)
})

@modelcontextprotocol/hono

import { McpServer, WebStandardStreamableHTTPServerTransport } from '@modelcontextprotocol/server'
import { createMcpHonoApp } from '@modelcontextprotocol/hono'
 
const server = new McpServer({ name: 'my-server', version: '1.0.0' })
const transport = new WebStandardStreamableHTTPServerTransport({ sessionIdGenerator: undefined })
await server.connect(transport)
 
const app = createMcpHonoApp()
app.all('/mcp', (c) => {
  return transport.handleRequest(c.req.raw, { parsedBody: c.get('parsedBody') })
})

@modelcontextprotocol/node

// https://github.com/modelcontextprotocol/typescript-sdk/blob/main/packages/middleware/node/src/streamableHttp.ts
import { getRequestListener } from '@hono/node-server';
 
//...
 
async handleRequest (
  req: IncomingMessage & { auth?: AuthInfo },
  res: ServerResponse,
  parsedBody?: unknown
): Promise<void> {
  // Store context for this request to pass through auth and parsedBody
  // We need to intercept the request creation to attach this context
  const authInfo = req.auth
 
  // Create a custom handler that includes our context
  // overrideGlobalObjects: false prevents Hono from overwriting global Response, which would
  // break frameworks like Next.js whose response classes extend the native Response
  const handler = getRequestListener(
    async (webRequest: Request) => {
      return this._webStandardTransport.handleRequest(webRequest, {
        authInfo,
        parsedBody
      })
    },
    { overrideGlobalObjects: false }
  )
 
  // Delegate to the request listener which handles all the Node.js <-> Web Standard conversion
  // including proper SSE streaming support
  await handler(req, res)
}

The approach until now

@hono/mcp

import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js'
import { StreamableHTTPTransport } from '@hono/mcp'
import { Hono } from 'hono'
import { mcpServer } from './mcp-server'
 
const app = new Hono()
 
// Initialize the transport
const transport = new StreamableHTTPTransport()
 
app.all('/mcp', async (c) => {
  if (!mcpServer.isConnected()) {
    // Connect the mcp with the transport
    await mcpServer.connect(transport)
  }
 
  return transport.handleRequest(c)
})

From now on, this can be done with the MCP SDK itself

Use the pkg.pr.new version:

bun add https://pkg.pr.new/modelcontextprotocol/typescript-sdk/@modelcontextprotocol/server@1326

mcp-server.ts:

// mcp-server.ts
import { McpServer } from '@modelcontextprotocol/server'
import * as z from 'zod/v4'
 
export const mcpServer = new McpServer({
  name: 'simple-server',
  version: '0.0.1'
})
 
mcpServer.registerTool(
  'add',
  {
    title: 'Add a to b',
    inputSchema: { a: z.number(), b: z.number() }
  },
  async ({ a, b }) => {
    return {
      content: [{ type: 'text', text: `${a + b}` }]
    }
  }
)

index.ts:

// index.ts
import { Hono } from 'hono'
import { WebStandardStreamableHTTPServerTransport } from '@modelcontextprotocol/server'
import { mcpServer } from './mcp-server'
 
const app = new Hono()
 
const transport = new WebStandardStreamableHTTPServerTransport()
 
app.all('/mcp', async (c) => {
  if (!mcpServer.isConnected()) {
    await mcpServer.connect(transport)
  }
  return transport.handleRequest(c.req.raw)
})
 
export default app

Inspector:

DANGEROUSLY_OMIT_AUTH=true npx @modelcontextprotocol/inspector
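
With the dev server running, point the Inspector at the app's /mcp route, for example http://localhost:8787/mcp when using wrangler dev.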

MCP Server for Hono

import { Hono } from 'hono'
import { z } from 'zod'
import { mcp, registerTool } from 'hono-mcp-server'
 
const app = new Hono()
 
app.get('/', (c) => {
  return c.json({ message: 'Hello, MCP Server!' })
})
 
app.post(
  '/hello',
  registerTool({
    description: 'Say hello',
    inputSchema: {
      name: z.string().describe('Your name')
    }
  }),
  (c) => {
    const { name } = c.req.valid('json') // Typed!
    return c.json({ message: `Hello ${name}!` })
  }
)
 
export default mcp(app, {
  name: 'Simple MCP',
  version: '1.0.0'
})

hono-mcp-server

MCP Apps

MCP Apps

3. Building AI apps with Hono

Let's build AI apps with Hono

Using Workers AI

import { Hono } from 'hono'
 
const app = new Hono<{
  Bindings: CloudflareBindings
}>()
 
app.get('/', async (c) => {
  const stream = await c.env.AI.run('@cf/meta/llama-3.3-70b-instruct-fp8-fast', {
    messages: [
      { role: 'system', content: 'You are a ramen master' },
      {
        role: 'user',
        content: 'What is the tonkotsu ramen?'
      }
    ],
    stream: true
  })
  return c.body(stream, 200, {
    'Content-Type': 'text/event-stream'
  })
})
 
export default app

Client implementation:

import { stream } from 'fetch-event-stream'
import { stdout } from 'node:process'
 
const events = await stream('http://localhost:8787')
 
for await (let event of events) {
  if (event.data) {
    try {
      const data = JSON.parse(event.data)
      stdout.write(data.response)
    } catch {}
  }
}

You can call other APIs too

A Replicate example

import { Hono } from 'hono'
import Replicate from 'replicate'
 
const app = new Hono()
 
app.get('/', async (c) => {
  const replicate = new Replicate()
  const output = await replicate.run('google/imagen-4', {
    input: {
      prompt: 'A mascot of Hono framework flying in the sky'
    }
  })
  return c.body(output as ReadableStream)
})
 
export default app

A chat app example

Steps

Create a Cloudflare + Vite project

Use create-hono and choose cloudflare-workers+vite:

bun create hono@latest my-chat

Calling the LLM

Add the AI binding to wrangler.jsonc:

{
  "$schema": "node_modules/wrangler/config-schema.json",
  "name": "my-chat",
  "compatibility_date": "2025-08-03",
  "main": "./src/index.tsx",
  "ai": {
    "binding": "AI",
    "remote": true
  }
}

Generate the type definitions:

bun run cf-typegen
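
This generates worker-configuration.d.ts. Roughly, it should declare the binding something like this (a sketch; the exact output depends on your wrangler version):

// worker-configuration.d.ts (sketch of the generated file)
interface CloudflareBindings {
  AI: Ai
}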

Pass them to the generics:

// src/index.tsx
const app = new Hono<{ Bindings: CloudflareBindings }>()

Call the Workers AI LLM and return the stream:

// src/index.tsx
app.get('/stream', async (c) => {
  const stream = await c.env.AI.run('@cf/meta/llama-3.3-70b-instruct-fp8-fast', {
    messages: [{ role: 'user', content: 'Hello!' }],
    stream: true
  })
  return c.body(stream, 200, {
    'Content-Type': 'text/event-stream'
  })
})

Schema validation

A schema definition for receiving messages:

// src/index.tsx
import * as z from 'zod'
 
// ...
 
const schema = z.object({
  messages: z.array(
    z.object({
      role: z.enum(['system', 'user', 'assistant']),
      content: z.string()
    })
  )
})

Accept it via POST and validate:

// src/index.tsx
import { zValidator } from '@hono/zod-validator'
 
// ...
 
app.post('/stream', zValidator('json', schema), async (c) => {
  const data = c.req.valid('json')
  const stream = await c.env.AI.run('@cf/meta/llama-3.3-70b-instruct-fp8-fast', {
    messages: data.messages,
    stream: true
  })
  return c.body(stream, 200, {
    'Content-Type': 'text/event-stream'
  })
})
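
As in section 1, you can hit this endpoint without starting a server. A sketch using the hono request flags shown earlier (the JSON body is just an example):

hono request \
  -P /stream \
  -X POST \
  -d '{"messages":[{"role":"user","content":"Hello!"}]}' \
  src/index.tsx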

SSR rendering

Build the top page with SSR:

// src/index.tsx
app.get('/', (c) => {
  return c.render(
    <div id="chat-container">
      <div id="messages"></div>
      <form id="chat-form">
        <textarea id="input" placeholder="Type a message..." rows={3}></textarea>
        <button type="submit">Send</button>
      </form>
    </div>
  )
})

Client

Import Script from vite-ssr-components and point it at the client file:

// src/renderer.tsx
import { jsxRenderer } from 'hono/jsx-renderer'
import { Script, Link, ViteClient } from 'vite-ssr-components/hono'
 
export const renderer = jsxRenderer(
  ({ children }) => {
    return (
      <html>
        <head>
          <ViteClient />
          <Script src="/src/client.ts" />
          <Link href="/src/style.css" rel="stylesheet" />
        </head>
        <body>{children}</body>
      </html>
    )
  },
  { stream: true }
)

Create client.ts:

// src/client.ts
/// <reference lib="dom" />
/// <reference lib="dom.iterable" />

fetch-event-stream can be handy for parsing the stream:

// src/client.ts
import { events } from 'fetch-event-stream'
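
A minimal sketch of what client.ts could look like, assuming the /stream endpoint and the DOM ids from the SSR page above (DOM handling kept deliberately simple):

// src/client.ts (sketch)
import { events } from 'fetch-event-stream'

const form = document.querySelector<HTMLFormElement>('#chat-form')!
const input = document.querySelector<HTMLTextAreaElement>('#input')!
const messagesEl = document.querySelector<HTMLDivElement>('#messages')!

// Conversation history sent to the server on every request
const messages: { role: 'user' | 'assistant'; content: string }[] = []

form.addEventListener('submit', async (e) => {
  e.preventDefault()
  messages.push({ role: 'user', content: input.value })
  input.value = ''

  const res = await fetch('/stream', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages })
  })

  // Append the assistant's reply token by token as SSE events arrive
  let assistant = ''
  const el = document.createElement('div')
  messagesEl.appendChild(el)

  for await (const event of events(res)) {
    if (!event.data || event.data === '[DONE]') continue
    try {
      assistant += JSON.parse(event.data).response
      el.textContent = assistant
    } catch {}
  }
  messages.push({ role: 'assistant', content: assistant })
})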

Leave the rest to AI

The rest can be left to AI.

4. Do we need frameworks in the AI era?

Frameworks in the AI era

Best practices for Hono and AI

A plug for Developers Summit

Developers Summit

Aside - this web page is also built with Hono

.
├── app
│   ├── client.ts
│   ├── global.d.ts
│   ├── routes
│   │   ├── _404.tsx
│   │   ├── _error.tsx
│   │   ├── _renderer.tsx
│   │   └── index.mdx
│   ├── server.ts
│   └── style.css
├── package.json
├── public
│   └── favicon.ico
├── tsconfig.json
├── vite.config.ts
└── wrangler.jsonc

Wrap-up

I've talked about working with AI using Hono:

  1. The Hono CLI's approach to AI
  2. Hono powering MCP
  3. Building AI apps with Hono
  4. Do we need frameworks in the AI era?