linear-performance-tuning
Optimize Linear API queries and caching for better performance. Use when improving response times, reducing API calls, or implementing caching strategies. Trigger with phrases like "linear performance", "optimize linear", "linear caching", "linear slow queries", "speed up linear".

allowed-tools: Read, Write, Edit, Grep
version: 1.0.0
license: MIT
author: Jeremy Longshore <jeremy@intentsolutions.io>
Allowed Tools
Read, Write, Edit, Grep
Provided by Plugin
linear-pack
Claude Code skill pack for Linear (24 skills)
Installation
This skill is included in the linear-pack plugin:
/plugin install linear-pack@claude-code-plugins-plus
Instructions
# Linear Performance Tuning
## Overview
Optimize Linear API usage for maximum performance and minimal latency.
## Prerequisites
- Working Linear integration
- Understanding of GraphQL
- Caching infrastructure (Redis recommended)
## Instructions
### Step 1: Query Optimization
**Minimize Field Selection:**
```typescript
// BAD: Fetching unnecessary fields
const issues = await client.issues();
for (const issue of issues.nodes) {
  // Only using id and title, but fetching everything
  console.log(issue.id, issue.title);
}

// GOOD: Request only needed fields
const query = `
  query MinimalIssues($first: Int!) {
    issues(first: $first) {
      nodes {
        id
        title
      }
    }
  }
`;
```
**Avoid N+1 Queries:**
```typescript
// BAD: N+1 queries
const issues = await client.issues();
for (const issue of issues.nodes) {
  const state = await issue.state; // Separate query per issue!
  console.log(issue.title, state?.name);
}

// GOOD: Use connections and batch loading
const query = `
  query IssuesWithState($first: Int!) {
    issues(first: $first) {
      nodes {
        id
        title
        state {
          name
        }
      }
    }
  }
`;
```
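The minimal-selection pattern above can be generalized with a small helper that builds the selection set from a list of field names. This is a sketch: `buildIssuesQuery` is a hypothetical helper, not part of `@linear/sdk`, and it assumes flat (non-nested) field names.

```typescript
// Build a GraphQL issues query that selects only the fields the caller names.
// buildIssuesQuery is a hypothetical helper, not part of @linear/sdk.
function buildIssuesQuery(fields: string[]): string {
  const selection = fields.join("\n        ");
  return `
  query MinimalIssues($first: Int!) {
    issues(first: $first) {
      nodes {
        ${selection}
      }
    }
  }`;
}

// The resulting document can be sent through any GraphQL client
const query = buildIssuesQuery(["id", "title"]);
```

Centralizing query construction this way keeps field selection deliberate: adding a field becomes a reviewed change rather than an accidental over-fetch.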
### Step 2: Implement Caching Layer
```typescript
// lib/cache.ts
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL);

interface CacheOptions {
  ttlSeconds: number;
  keyPrefix?: string;
}

export class LinearCache {
  private keyPrefix: string;
  private defaultTtl: number;

  constructor(options: CacheOptions = { ttlSeconds: 300 }) {
    this.keyPrefix = options.keyPrefix || "linear";
    this.defaultTtl = options.ttlSeconds;
  }

  private key(key: string): string {
    return `${this.keyPrefix}:${key}`;
  }

  async get<T>(key: string): Promise<T | null> {
    const data = await redis.get(this.key(key));
    return data ? (JSON.parse(data) as T) : null;
  }

  async set<T>(key: string, value: T, ttl = this.defaultTtl): Promise<void> {
    await redis.setex(this.key(key), ttl, JSON.stringify(value));
  }

  async getOrFetch<T>(
    key: string,
    fetcher: () => Promise<T>,
    ttl = this.defaultTtl
  ): Promise<T> {
    const cached = await this.get<T>(key);
    if (cached !== null) return cached;
    const data = await fetcher();
    await this.set(key, data, ttl);
    return data;
  }

  async invalidate(pattern: string): Promise<void> {
    // Note: KEYS blocks Redis; prefer SCAN for large keyspaces in production
    const keys = await redis.keys(this.key(pattern));
    if (keys.length) {
      await redis.del(...keys);
    }
  }
}

export const cache = new LinearCache({ ttlSeconds: 300 });
```
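For local development or tests where Redis is unavailable, the same `getOrFetch` pattern can be backed by an in-memory `Map`. This is a sketch: `MemoryCache` is a hypothetical stand-in, and unlike the Redis-backed version its entries are per-process and lost on restart.

```typescript
// In-memory TTL cache mirroring the getOrFetch pattern above.
// MemoryCache is a hypothetical stand-in for local development; unlike Redis,
// entries are per-process and disappear when the process exits.
class MemoryCache {
  private store = new Map<string, { value: unknown; expiresAt: number }>();

  get<T>(key: string): T | null {
    const entry = this.store.get(key);
    if (!entry || entry.expiresAt < Date.now()) {
      this.store.delete(key); // drop expired entries lazily on read
      return null;
    }
    return entry.value as T;
  }

  set<T>(key: string, value: T, ttlSeconds: number): void {
    this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
  }

  async getOrFetch<T>(
    key: string,
    fetcher: () => Promise<T>,
    ttlSeconds = 300
  ): Promise<T> {
    const cached = this.get<T>(key);
    if (cached !== null) return cached;
    const value = await fetcher();
    this.set(key, value, ttlSeconds);
    return value;
  }
}
```

Because the interface matches, swapping the backends behind a shared type keeps application code identical in both environments.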
### Step 3: Cached Client Wrapper
```typescript
// lib/cached-client.ts
import { LinearClient } from "@linear/sdk";
import { cache } from "./cache";

export class CachedLinearClient {
  private client: LinearClient;

  constructor(apiKey: string) {
    this.client = new LinearClient({ apiKey });
  }

  async getTeams() {
    return cache.getOrFetch(
      "teams",
      async () => {
        const teams = await this.client.teams();
        return teams.nodes.map(t => ({ id: t.id, name: t.name, key: t.key }));
      },
      3600 // Teams rarely change, cache for 1 hour
    );
  }

  async getWorkflowStates(teamKey: string) {
    return cache.getOrFetch(
      `states:${teamKey}`,
      async () => {
        const teams = await this.client.teams({
          filter: { key: { eq: teamKey } },
        });
        const team = teams.nodes[0];
        if (!team) throw new Error(`Unknown team key: ${teamKey}`);
        const states = await team.states();
        return states.nodes.map(s => ({
          id: s.id,
          name: s.name,
          type: s.type,
        }));
      },
      3600 // States rarely change
    );
  }

  async getIssue(identifier: string, maxAge = 60) {
    return cache.getOrFetch(
      `issue:${identifier}`,
      async () => {
        const issue = await this.client.issue(identifier);
        const state = await issue.state;
        return {
          id: issue.id,
          identifier: issue.identifier,
          title: issue.title,
          state: state?.name,
          priority: issue.priority,
        };
      },
      maxAge
    );
  }

  // Invalidate cached issues when we know data changed
  async createIssue(input: any) {
    const result = await this.client.createIssue(input);
    await cache.invalidate("issue:*"); // matches the `issue:${identifier}` keys above
    return result;
  }
}
```
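To confirm the cache is actually reducing API traffic, it helps to track the hit rate. A minimal sketch of such a counter (`CacheStats` is a hypothetical helper, not part of the Linear SDK):

```typescript
// Track cache hits vs. misses to measure how much API traffic caching saves.
// CacheStats is a hypothetical helper, not part of @linear/sdk.
class CacheStats {
  private hits = 0;
  private misses = 0;

  recordHit(): void { this.hits++; }
  recordMiss(): void { this.misses++; }

  // Fraction of lookups served from cache (0 when nothing recorded yet)
  hitRate(): number {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : this.hits / total;
  }
}

// Usage: record a hit whenever getOrFetch returns without running the fetcher,
// a miss when the fetcher executes, and log hitRate() periodically.
const stats = new CacheStats();
stats.recordHit();
stats.recordHit();
stats.recordMiss();
console.log(stats.hitRate()); // roughly 0.67
```

A persistently low hit rate usually means TTLs are too short for how often the underlying data actually changes, or cache keys are too granular to be reused.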
### Step 4: Request Batching
```typescript
// lib/batcher.ts
interface BatchRequest<T> {
  key: string;
  resolve: (value: T) => void;
  reject: (error: Error) => void;
}

class RequestBatcher<T> {
  private queue: BatchRequest<T>[] = [];
  private timeout: NodeJS.Timeout | null = null;
  private batchSize: number;
  private delayMs: number;
  private batchFetcher: (keys: string[]) => Promise<T[]>
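
// The class above can be completed along the following lines. This is a
// self-contained sketch under stated assumptions: SimpleBatcher is a
// hypothetical name, and the batch fetcher is assumed to return results in
// the same order as the keys it receives.
class SimpleBatcher<T> {
  private queue: { key: string; resolve: (v: T) => void; reject: (e: Error) => void }[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private batchFetcher: (keys: string[]) => Promise<T[]>,
    private batchSize = 50,
    private delayMs = 10
  ) {}

  // Queue a key; the fetch is deferred briefly so concurrent callers
  // within the same window share a single batched request.
  load(key: string): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      this.queue.push({ key, resolve, reject });
      if (this.queue.length >= this.batchSize) {
        this.flush();
      } else if (!this.timer) {
        this.timer = setTimeout(() => this.flush(), this.delayMs);
      }
    });
  }

  private async flush(): Promise<void> {
    if (this.timer) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    const batch = this.queue.splice(0, this.queue.length);
    if (!batch.length) return;
    try {
      const results = await this.batchFetcher(batch.map(r => r.key));
      batch.forEach((req, i) => req.resolve(results[i]));
    } catch (err) {
      batch.forEach(req => req.reject(err as Error));
    }
  }
}

// Usage: loads issued within the same delay window are served by one fetch
// const batcher = new SimpleBatcher<number>(async keys => keys.map(k => k.length));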