diff --git a/AGENTS.md b/AGENTS.md index e0e7ee30..9d9d54c2 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -185,9 +185,13 @@ When touching transaction and position flows, validation MUST include all releva 47. **Indexed event-history pagination integrity**: when stitching user history from raw indexer event tables, page each table with a stable unique order (include an event-unique tie-breaker such as indexed row `id`, not just `timestamp` or `txHash`), freeze the page window against new-head inserts during pagination, dedupe API pages by source event identity, and reconcile local receipt caches to indexed history with a cross-source event merge key (`hash + type + market + assets + shares`, or stronger) rather than raw tx-hash presence. 48. **Paged history source consistency integrity**: when a paged transaction/history read falls back from a primary provider, choose the fallback provider once per query and keep it fixed across all pages. Do not mix providers page-by-page, and do not silently return truncated history when page caps are hit; fail closed with an explicit error instead. 49. **Merged-history pagination query integrity**: when a UI paginates over a merged transaction/history dataset that is fetched as a full filtered set (for example because the backend cannot page a merged multi-table stream correctly), keep page changes local to the UI and do not include `skip`/`first` or current-page state in the upstream query key. Fetch once per filter scope, then slice locally, so pagination controls do not re-trigger the full upstream history cycle on every click. -50. **Market-registry fallback integrity**: shared multi-chain market registry reads must preserve per-chain isolation across providers. 
If a chain-scoped Morpho market-list request fails, fall back to a chain-scoped Monarch/Envio market-list query before subgraph, keep the fallback fully paginated, and apply the same canonical market guards (for example non-zero collateral token and non-zero IRM) so one broken chain such as HyperEVM does not blank the registry for healthy chains. +50. **Market-registry fallback integrity**: shared multi-chain market registry reads must preserve per-chain isolation across providers. The primary path may use one shared Monarch/Envio market-list query for first paint, but any chain missing from that result must fall back independently to Morpho per chain and then subgraph per chain. Keep every fallback fully paginated, and apply the same canonical market guards (for example non-zero collateral token and non-zero IRM) so one broken chain such as HyperEVM does not blank the registry for healthy chains. 51. **Fallback token-decimal integrity**: market-registry fallback paths must not invent ERC20 decimals for unknown tokens. Resolve unknown token decimals through shared chain-scoped RPC multicalls keyed by canonical `chainId + address`, and fail closed for any market whose required token metadata cannot be resolved safely. 52. **Market-detail live-state source integrity**: `useMarketData` must treat Monarch/Envio as the primary source for live market state fields that drive the market page (`supplyApy`, `borrowApy`, `apyAtTarget`, `utilization`, balances/shares/liquidity where available), while preserving fallback-shell metadata fields that Monarch does not yet expose with parity (for example `whitelisted`, `supplyingVaults`, and rolling daily/weekly/monthly APYs). Do not let broken Morpho live state leak directly into market-detail headers/charts when Monarch state is available, and do not replace the whole market shell with partial Monarch metadata. +53. **Market-registry enrichment boundary integrity**: keep the global market-registry core shell source-agnostic. 
Do not require provider-specific rolling history fields for first paint, and do not fill those fields with synthetic zeroes in the core registry path. Historical list enrichments such as rolling supply/borrow APYs must resolve through a separate chokepoint keyed by canonical `chainId + market.uniqueKey`, preferably from shared RPC/archive-node snapshot helpers when the primary index source does not expose that history. +54. **Market USD-price provenance integrity**: when market-list USD values are recomputed through shared token-price hooks, keep price provenance separate from the numeric USD value. Only mark a market as having a real USD price, or remove estimated-price UI, when the loan-token price came from a direct chain-scoped price source rather than a peg or hardcoded fallback. If a direct price becomes available after first paint, replace the previously estimated loan-asset USD values at the shared market-processing chokepoint instead of leaving stale estimated values and flags in place. +55. **Token metadata integrity**: chain-scoped token metadata used by market registries must treat `decimals` and `symbol` as one metadata unit. When a token is not in the local registry, resolve both fields through shared batched on-chain reads rather than mixing RPC decimals with placeholder symbols. Manual entries in `src/utils/tokens.ts` must be validated against on-chain `decimals()` and `symbol()` per chain through the shared verifier command, and any intentional display-symbol differences must be captured as explicit overrides instead of silent drift. +56. **Market shell parity integrity**: single-market market-shell fetchers must apply the same canonical registry guards as list fetchers, so blocked or malformed markets cannot re-enter through detail-path reads. In per-chain provider fallback chains, treat empty non-authoritative results the same as provider failure and continue to the next provider instead of short-circuiting fallback for that chain. 
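The canonical market guards that rules 50 and 56 require on every provider path can be sketched as one shared predicate. This is an illustrative standalone version only: the real guard is `isMarketRegistryEntryAllowed` in `src/utils/markets`, and the blacklist below is an empty placeholder, not the actual `src/utils/tokens.ts` list.

```typescript
// Hypothetical standalone sketch of the shared registry guard.
// Field names mirror the isMarketRegistryEntryAllowed call sites in this
// change; the blacklist contents are placeholders for illustration.
const ZERO_ADDRESS = '0x0000000000000000000000000000000000000000';

// Placeholder; the real list lives in src/utils/tokens.ts.
const blacklistTokens: string[] = [];

type RegistryEntry = {
  loanAssetAddress?: string;
  collateralAssetAddress?: string;
  irmAddress?: string;
};

const isMarketRegistryEntryAllowed = (entry: RegistryEntry): boolean => {
  const loan = entry.loanAssetAddress?.toLowerCase() ?? '';
  const collateral = entry.collateralAssetAddress?.toLowerCase() ?? '';
  const irm = entry.irmAddress?.toLowerCase() ?? '';

  // Canonical guards: non-zero collateral token and non-zero IRM.
  if (!collateral || collateral === ZERO_ADDRESS) return false;
  if (!irm || irm === ZERO_ADDRESS) return false;

  // Blacklisted loan or collateral tokens are excluded on every provider path,
  // so blocked markets cannot re-enter through detail-path reads.
  if (blacklistTokens.includes(loan) || blacklistTokens.includes(collateral)) return false;

  return true;
};
```

Because list fetchers (Morpho, Monarch, subgraph) and the single-market shell fetcher all route through the same predicate, a malformed market filtered out of the list view cannot reappear on a detail page.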
### REQUIRED: Regression Rule Capture diff --git a/docs/TECHNICAL_OVERVIEW.md b/docs/TECHNICAL_OVERVIEW.md index e47d3716..080f9a7f 100644 --- a/docs/TECHNICAL_OVERVIEW.md +++ b/docs/TECHNICAL_OVERVIEW.md @@ -203,7 +203,7 @@ Market metrics: Monarch metrics API via `/api/monarch/metrics` ### Dynamic Data (Runtime fetched) | Data Type | Source | Refresh | Query Hook | |-----------|--------|---------|------------| -| Markets list | Morpho API → Monarch API → Subgraph | 5 min stale | `useMarketsQuery` | +| Markets list | Monarch multi-chain → Morpho per chain → Subgraph per chain | 5 min stale | `useMarketsQuery` | | Market metrics (flows, trending) | Monarch API | 5 min stale | `useMarketMetricsQuery` | | Market state (APY, utilization, balances) | Monarch market state + Morpho/Subgraph shell + RPC snapshot | 30s stale | `useMarketData` | | Market historical chart series | Monarch GraphQL → Morpho API → Subgraph | 5 min stale | `useMarketHistoricalData` | @@ -231,8 +231,8 @@ Hooks omitted from this matrix are local-state hooks or pure view/composition he | Hook / Family | Responsibility | Infra Today | Full Monarch Support Still Needs | |---------------|----------------|-------------|----------------------------------| -| `useMarketsQuery` | Global market registry used across the app | Morpho API first per chain, then Monarch API, then subgraph | Rolling daily/weekly/monthly APYs plus whitelist/supplying-vault metadata parity if we ever want Monarch-first registry reads | -| `useProcessedMarkets` | Blacklist/filtering layer on top of market registry, plus USD backfill | `useMarketsQuery` + `useTokenPrices` | Inherits `useMarketsQuery`; also needs a Monarch-native token price source if we want to remove Morpho price reads | +| `useMarketsQuery` | Global market registry used across the app | Monarch multi-chain first, then Morpho per chain, then subgraph per chain | Optional metadata parity from Morpho plus any non-core enrichment we may keep outside the primary 
registry | +| `useProcessedMarkets` | Blacklist/filtering layer on top of market registry, plus RPC historical-rate enrichment and USD backfill | `useMarketsQuery` + RPC/archive snapshots + `useTokenPrices` | Inherits `useMarketsQuery`; also needs a Monarch-native token price source if we want to remove Morpho price reads | | `useMarketData` | Single-market detail shell with freshest live state | Monarch live-state overlay on Morpho/Subgraph shell, then RPC snapshot override | Whitelist, supplying-vault, and rolling-APY metadata parity if we want to remove the shell fallback entirely | | `useMarketHistoricalData` | Historical market chart series | Monarch historical snapshots first; Morpho API/Subgraph only for fallback | Already aligned for the current asset-only market charts | | `useTokenPrices` | Token USD price lookup and peg fallback used by markets/admin stats | Morpho price API + major price fallback | Intentionally Morpho/major-price backed today | @@ -367,9 +367,10 @@ supportsMorphoApi(network) returns true for: - Mainnet, Base, Unichain, Polygon, Arbitrum, HyperEVM, Monad Fallback Strategy: -1. IF supportsMorphoApi(network) → Try Morpho API -2. IF API fails OR unsupported → Try Subgraph -3. Each network fails independently (partial data OK) +1. `useMarketsQuery` tries one shared Monarch market-registry read first +2. Any chain missing from that result falls back independently to Morpho API when supported +3. If Morpho fails or is unsupported, that chain falls back to Subgraph +4. Each network still fails independently (partial data OK) ``` ### GraphQL Fetchers @@ -414,7 +415,7 @@ Fallback Strategy: ### Key Patterns -1. **Feature-Scoped Priority**: Monarch-first for market detail/history/activity, Morpho-first for the global market registry, Subgraph last +1. **Feature-Scoped Priority**: Monarch-first for market detail/history/activity and the global market registry core shell, Morpho/Subgraph fallback last 2. 
**Parallel Execution**: `Promise.all()` for multi-network 3. **Graceful Degradation**: Partial data > Error 4. **Three-Phase Market Detail**: Monarch live state + fallback shell + RPC snapshot diff --git a/package.json b/package.json index 384d885a..597c1e94 100644 --- a/package.json +++ b/package.json @@ -14,6 +14,7 @@ "stylelint": "stylelint '**/*.css' --fix", "stylelint:check": "stylelint '**/*.css'", "test:coverage:open": "pnpm test:coverage && open coverage/lcov-report/index.html", + "tokens:verify-metadata": "tsx scripts/verify-token-metadata.ts", "typecheck": "tsc --noEmit" }, "dependencies": { diff --git a/scripts/verify-token-metadata.ts b/scripts/verify-token-metadata.ts new file mode 100644 index 00000000..31143275 --- /dev/null +++ b/scripts/verify-token-metadata.ts @@ -0,0 +1,186 @@ +import { createRequire } from 'node:module'; +import type { SupportedNetworks as SupportedNetworkId } from '../src/utils/networks'; + +const moduleRequire = createRequire(import.meta.url); +const { loadEnvConfig } = moduleRequire('@next/env') as { + loadEnvConfig: (dir: string) => void; +}; + +loadEnvConfig(process.cwd()); + +const assetExtensionLoader: NodeJS.RequireExtensions[string] = (module, filename) => { + module.exports = filename; +}; + +for (const extension of ['.png', '.svg', '.webp']) { + moduleRequire.extensions[extension] = assetExtensionLoader; +} + +const { supportedTokens, infoToKey, MORPHO_LEGACY } = moduleRequire('../src/utils/tokens') as typeof import('../src/utils/tokens'); +const { SupportedNetworks, getDefaultRPC, getNetworkName } = moduleRequire( + '../src/utils/networks', +) as typeof import('../src/utils/networks'); +const { fetchOnchainTokenMetadataMap } = moduleRequire('../src/utils/tokenMetadata') as typeof import('../src/utils/tokenMetadata'); + +type VerificationEntry = { + address: string; + chainId: number; + symbol: string; + decimals: number; +}; + +type VerificationIssue = { + kind: 'decimals-mismatch' | 'missing-decimals' | 
'missing-symbol' | 'symbol-mismatch'; + entry: VerificationEntry; + onchainDecimals?: number; + onchainSymbol?: string; + expectedOnchainSymbol?: string; +}; + +type ChainIssue = { + chainId: number; + message: string; +}; + +const expectedOnchainSymbolByKey = new Map([ + [infoToKey('0x35d8949372d46b7a3d5a56006ae77b215fc69bc0', SupportedNetworks.Mainnet), 'bUSD0'], + [infoToKey('0x00000000efe302beaa2b3e6e1b18d08d69a9012a', SupportedNetworks.Mainnet), 'AUSD'], + [infoToKey('0x00000000efe302beaa2b3e6e1b18d08d69a9012a', SupportedNetworks.Monad), 'AUSD'], + [infoToKey('0x8236a87084f8b84306f72007f36f2618a5634494', SupportedNetworks.Mainnet), 'LBTC'], + [infoToKey('0xecac9c5f704e954931349da37f60e39f515c11c1', SupportedNetworks.Base), 'LBTC'], + [infoToKey('0x00b174d66ada7d63789087f50a9b9e0e48446dc1', SupportedNetworks.Base), 'sPINTO'], + [infoToKey('0xb0505e5a99abd03d94a1169e638b78edfed26ea4', SupportedNetworks.Base), 'uSUI'], + [infoToKey('0xc2132d05d31c914a87c6611c10748aeb04b58e8f', SupportedNetworks.Polygon), 'USDT0'], + [infoToKey('0xe7cd86e13ac4309349f30b3435a9d337750fc82d', SupportedNetworks.Monad), 'USDT0'], + [infoToKey(MORPHO_LEGACY, SupportedNetworks.Mainnet), 'MORPHO'], +]); + +const getVerificationEntries = (): VerificationEntry[] => { + return supportedTokens.flatMap((token) => + token.networks.map((network) => ({ + address: network.address.toLowerCase(), + chainId: network.chain.id, + symbol: token.symbol, + decimals: token.decimals, + })), + ); +}; + +const formatIssue = (issue: VerificationIssue): string => { + const { address, symbol, decimals } = issue.entry; + + switch (issue.kind) { + case 'missing-decimals': + return `${symbol} (${address}): decimals() could not be read on-chain. Configured decimals=${decimals}.`; + case 'decimals-mismatch': + return `${symbol} (${address}): configured decimals=${decimals}, on-chain decimals=${issue.onchainDecimals}.`; + case 'missing-symbol': + return `${symbol} (${address}): symbol() could not be read on-chain. 
Configured symbol="${symbol}".`; + case 'symbol-mismatch': + return `${symbol} (${address}): configured symbol="${symbol}", expected on-chain symbol="${issue.expectedOnchainSymbol}", actual on-chain symbol="${issue.onchainSymbol}".`; + default: { + const unexpectedKind: never = issue.kind; + throw new Error(`Unknown verification issue kind: ${unexpectedKind}`); + } + } +}; + +const main = async () => { + const verificationEntries = getVerificationEntries(); + const chainConfigIssues: ChainIssue[] = []; + const entriesToVerify = verificationEntries.filter((entry) => { + const rpcUrl = getDefaultRPC(entry.chainId as SupportedNetworkId); + if (rpcUrl) { + return true; + } + + if (!chainConfigIssues.some((issue) => issue.chainId === entry.chainId)) { + chainConfigIssues.push({ + chainId: entry.chainId, + message: 'RPC is not configured for this network, so on-chain token metadata could not be verified.', + }); + } + + return false; + }); + + const metadataByToken = await fetchOnchainTokenMetadataMap( + entriesToVerify.map((entry) => ({ + address: entry.address, + chainId: entry.chainId as SupportedNetworkId, + })), + ); + + const issues: VerificationIssue[] = []; + + for (const entry of entriesToVerify) { + const key = infoToKey(entry.address, entry.chainId); + const metadata = metadataByToken.get(key); + + if (metadata?.decimals === undefined) { + issues.push({ + kind: 'missing-decimals', + entry, + }); + } else if (metadata.decimals !== entry.decimals) { + issues.push({ + kind: 'decimals-mismatch', + entry, + onchainDecimals: metadata.decimals, + }); + } + + if (!metadata?.symbol) { + issues.push({ + kind: 'missing-symbol', + entry, + }); + continue; + } + + const expectedOnchainSymbol = expectedOnchainSymbolByKey.get(key) ?? 
entry.symbol; + if (metadata.symbol !== expectedOnchainSymbol) { + issues.push({ + kind: 'symbol-mismatch', + entry, + onchainSymbol: metadata.symbol, + expectedOnchainSymbol, + }); + } + } + + if (issues.length === 0 && chainConfigIssues.length === 0) { + const networkCount = new Set(verificationEntries.map((entry) => entry.chainId)).size; + console.log(`Verified ${verificationEntries.length} token entries across ${networkCount} networks.`); + return; + } + + const issuesByChain = new Map(); + for (const issue of issues) { + const issuesForChain = issuesByChain.get(issue.entry.chainId) ?? []; + issuesForChain.push(issue); + issuesByChain.set(issue.entry.chainId, issuesForChain); + } + + console.error(`Token metadata verification failed with ${issues.length + chainConfigIssues.length} issue(s).`); + + for (const [chainId, issuesForChain] of Array.from(issuesByChain.entries()).sort((left, right) => left[0] - right[0])) { + console.error(`\n[${getNetworkName(chainId) ?? chainId}]`); + for (const issue of issuesForChain) { + console.error(`- ${formatIssue(issue)}`); + } + } + + for (const chainIssue of chainConfigIssues.sort((left, right) => left.chainId - right.chainId)) { + console.error(`\n[${getNetworkName(chainIssue.chainId) ?? chainIssue.chainId}]`); + console.error(`- ${chainIssue.message}`); + } + + process.exitCode = 1; +}; + +main().catch((error: unknown) => { + console.error('Token metadata verification failed with an unexpected error.'); + console.error(error); + process.exitCode = 1; +}); diff --git a/src/data-sources/monarch-api/historical.ts b/src/data-sources/monarch-api/historical.ts index bb2db18e..0f0f6b50 100644 --- a/src/data-sources/monarch-api/historical.ts +++ b/src/data-sources/monarch-api/historical.ts @@ -147,19 +147,14 @@ export const fetchMonarchMarketHistoricalData = async ( limit: MONARCH_HISTORICAL_PAGE_LIMIT, }; - const query = - options.interval === 'HOUR' - ? 
envioMarketHourlySnapshotsQuery - : envioMarketDailySnapshotsQuery; + const query = options.interval === 'HOUR' ? envioMarketHourlySnapshotsQuery : envioMarketDailySnapshotsQuery; try { const response = await monarchGraphqlFetcher(query, variables, { signal: controller.signal, }); const snapshots = - options.interval === 'HOUR' - ? (response.data?.MarketHourlySnapshot ?? []) - : (response.data?.MarketDailySnapshot ?? []); + options.interval === 'HOUR' ? (response.data?.MarketHourlySnapshot ?? []) : (response.data?.MarketDailySnapshot ?? []); return transformSnapshotsToHistoricalResult(snapshots); } catch (error) { diff --git a/src/data-sources/monarch-api/markets.ts b/src/data-sources/monarch-api/markets.ts index 8c86c6ed..a4b8b8a4 100644 --- a/src/data-sources/monarch-api/markets.ts +++ b/src/data-sources/monarch-api/markets.ts @@ -1,9 +1,10 @@ import { Market as BlueMarket, MarketParams as BlueMarketParams } from '@morpho-org/blue-sdk'; import { formatUnits, type Address, zeroAddress } from 'viem'; import { buildEnvioMarketsPageQuery, envioMarketByIdQuery } from '@/graphql/envio-queries'; +import { isMarketRegistryEntryAllowed } from '@/utils/markets'; import { getMorphoAddress } from '@/utils/morpho'; import { isSupportedChain, type SupportedNetworks } from '@/utils/networks'; -import { blacklistTokens, infoToKey } from '@/utils/tokens'; +import { infoToKey } from '@/utils/tokens'; import { resolveTokenInfos, type ResolvedTokenInfo, type TokenAddressInput } from '@/utils/tokenMetadata'; import type { Market, MarketWarning } from '@/utils/types'; import { UNRECOGNIZED_COLLATERAL, UNRECOGNIZED_LOAN } from '@/utils/warnings'; @@ -134,7 +135,13 @@ const mapMonarchMarketToMarket = (market: MonarchMarketRow, tokenInfos: Map & { +type MorphoApiMarketState = Omit< + Market['state'], + 'dailySupplyApy' | 'dailyBorrowApy' | 'weeklySupplyApy' | 'weeklyBorrowApy' | 'monthlySupplyApy' | 'monthlyBorrowApy' +> & + Partial< + Pick< + Market['state'], + 'dailySupplyApy' | 
'dailyBorrowApy' | 'weeklySupplyApy' | 'weeklyBorrowApy' | 'monthlySupplyApy' | 'monthlyBorrowApy' + > + >; + +type MorphoApiMarket = Omit & { oracle: { address: string } | null; listed: boolean; + state: MorphoApiMarketState; + supplyingVaults?: { address: string }[]; }; type MarketGraphQLResponse = { @@ -42,17 +55,36 @@ const MORPHO_MARKETS_PAGE_BATCH_SIZE = 4; // Transform API response to internal Market type const processMarketData = (market: MorphoApiMarket): Market => { - const { oracle, listed, ...rest } = market; + const { oracle, listed, state, supplyingVaults, ...rest } = market; return { ...rest, oracleAddress: (oracle?.address ?? zeroAddress) as Address, whitelisted: listed, hasUSDPrice: true, + supplyingVaults: supplyingVaults ?? [], + state: { + ...state, + dailySupplyApy: state.dailySupplyApy ?? null, + dailyBorrowApy: state.dailyBorrowApy ?? null, + weeklySupplyApy: state.weeklySupplyApy ?? null, + weeklyBorrowApy: state.weeklyBorrowApy ?? null, + monthlySupplyApy: state.monthlySupplyApy ?? null, + monthlyBorrowApy: state.monthlyBorrowApy ?? 
null, + }, }; }; +const filterRegistryMarkets = (markets: Market[]): Market[] => + markets.filter((market) => + isMarketRegistryEntryAllowed({ + loanAssetAddress: market.loanAsset?.address, + collateralAssetAddress: market.collateralAsset?.address, + irmAddress: market.irmAddress, + }), + ); + // Fetcher for market details from Morpho API -export const fetchMorphoMarket = async (uniqueKey: string, network: SupportedNetworks): Promise => { +export const fetchMorphoMarket = async (uniqueKey: string, network: SupportedNetworks): Promise => { const response = await morphoGraphqlFetcher(marketDetailQuery, { uniqueKey, chainId: network, @@ -60,7 +92,10 @@ export const fetchMorphoMarket = async (uniqueKey: string, network: SupportedNet if (!response || !response.data || !response.data.marketByUniqueKey) { throw new Error('Market data not found in Morpho API response'); } - return processMarketData(response.data.marketByUniqueKey); + + const market = processMarketData(response.data.marketByUniqueKey); + + return filterRegistryMarkets([market])[0] ?? null; }; const fetchMorphoMarketsPage = async (network: SupportedNetworks, skip: number, pageSize: number): Promise => { @@ -107,11 +142,7 @@ export const fetchMorphoMarkets = async (network: SupportedNetworks): Promise 0) { console.warn('Received 0 items in the first page, but total count is positive. Returning first-page result only.'); - return allMarkets.filter( - (market) => - !blacklistTokens.includes(market.collateralAsset?.address.toLowerCase() ?? '') && - !blacklistTokens.includes(market.loanAsset?.address.toLowerCase() ?? ''), - ); + return filterRegistryMarkets(allMarkets); } const remainingOffsets: number[] = []; @@ -139,10 +170,5 @@ export const fetchMorphoMarkets = async (network: SupportedNetworks): Promise - !blacklistTokens.includes(market.collateralAsset?.address.toLowerCase() ?? '') && - !blacklistTokens.includes(market.loanAsset?.address.toLowerCase() ?? 
''), - ); + return filterRegistryMarkets(allMarkets); }; diff --git a/src/data-sources/subgraph/market.ts b/src/data-sources/subgraph/market.ts index 5427cc2c..dd823959 100644 --- a/src/data-sources/subgraph/market.ts +++ b/src/data-sources/subgraph/market.ts @@ -1,6 +1,7 @@ import type { Address } from 'viem'; import { marketQuery as subgraphMarketQuery, marketsQuery as subgraphMarketsQuery } from '@/graphql/morpho-subgraph-queries'; // Assuming query is here import { formatBalance } from '@/utils/balance'; +import { isMarketRegistryEntryAllowed } from '@/utils/markets'; import type { SupportedNetworks } from '@/utils/networks'; import type { SubgraphMarket, SubgraphMarketQueryResponse, SubgraphMarketsQueryResponse, SubgraphToken } from '@/utils/subgraph-types'; import { getSubgraphUrl } from '@/utils/subgraph-urls'; @@ -33,7 +34,7 @@ const transformSubgraphMarketToMarket = ( subgraphMarket: Partial, network: SupportedNetworks, majorPrices: MajorPrices, -): Market => { +): Market | null => { const marketId = subgraphMarket.id ?? ''; const lltv = subgraphMarket.lltv ?? '0'; const irmAddress = subgraphMarket.irm ?? '0x'; @@ -65,6 +66,16 @@ const transformSubgraphMarketToMarket = ( const loanAsset = mapToken(subgraphMarket.borrowedToken); const collateralAsset = mapToken(subgraphMarket.inputToken); + if ( + !isMarketRegistryEntryAllowed({ + loanAssetAddress: loanAsset.address, + collateralAssetAddress: collateralAsset.address, + irmAddress, + }) + ) { + return null; + } + const chainId = network; const supplyAssets = subgraphMarket.totalSupply ?? subgraphMarket.inputTokenBalance ?? 
'0'; @@ -273,5 +284,7 @@ export const fetchSubgraphMarkets = async (network: SupportedNetworks): Promise< } const majorPrices = await majorPricesPromise; - return allMarkets.map((market) => transformSubgraphMarketToMarket(market, network, majorPrices)); + return allMarkets + .map((market) => transformSubgraphMarketToMarket(market, network, majorPrices)) + .filter((market): market is Market => market !== null); }; diff --git a/src/features/market-detail/components/charts/volume-chart.tsx b/src/features/market-detail/components/charts/volume-chart.tsx index b6b86ee6..e694a309 100644 --- a/src/features/market-detail/components/charts/volume-chart.tsx +++ b/src/features/market-detail/components/charts/volume-chart.tsx @@ -137,7 +137,8 @@ function VolumeChart({ marketId, chainId, market }: VolumeChartProps) { const validDisplayData = assetData.filter((point: AssetTimeseriesDataPoint) => point.y !== null); const average = validDisplayData.length > 0 - ? validDisplayData.reduce((acc: number, point: AssetTimeseriesDataPoint) => acc + convertValue(point.y), 0) / validDisplayData.length + ? 
validDisplayData.reduce((acc: number, point: AssetTimeseriesDataPoint) => acc + convertValue(point.y), 0) / + validDisplayData.length : 0; return { current, netChangePercentage, average }; diff --git a/src/graphql/morpho-api-queries.ts b/src/graphql/morpho-api-queries.ts index 23a906b1..31f68021 100644 --- a/src/graphql/morpho-api-queries.ts +++ b/src/graphql/morpho-api-queries.ts @@ -147,12 +147,6 @@ export const marketsQuery = ` timestamp apyAtTarget rateAtTarget - dailySupplyApy - dailyBorrowApy - weeklySupplyApy - weeklyBorrowApy - monthlySupplyApy - monthlyBorrowApy } warnings { type diff --git a/src/hooks/queries/useMarketRateEnrichmentQuery.ts b/src/hooks/queries/useMarketRateEnrichmentQuery.ts new file mode 100644 index 00000000..52708e16 --- /dev/null +++ b/src/hooks/queries/useMarketRateEnrichmentQuery.ts @@ -0,0 +1,26 @@ +import { useMemo } from 'react'; +import { useQuery } from '@tanstack/react-query'; +import { useCustomRpcContext } from '@/components/providers/CustomRpcProvider'; +import { fetchMarketRateEnrichment, getMarketRateEnrichmentKey, type MarketRateEnrichmentMap } from '@/utils/market-rate-enrichment'; +import type { Market } from '@/utils/types'; + +const EMPTY_ENRICHMENT_MAP: MarketRateEnrichmentMap = new Map(); + +export const useMarketRateEnrichmentQuery = (markets: Market[]) => { + const { customRpcUrls, rpcConfigVersion } = useCustomRpcContext(); + + const marketIdentity = useMemo( + () => markets.map((market) => getMarketRateEnrichmentKey(market.uniqueKey, market.morphoBlue.chain.id)).sort(), + [markets], + ); + + return useQuery({ + queryKey: ['market-rate-enrichment', marketIdentity, rpcConfigVersion], + queryFn: async () => fetchMarketRateEnrichment(markets, customRpcUrls), + enabled: markets.length > 0, + staleTime: 15 * 60 * 1000, + gcTime: 30 * 60 * 1000, + refetchOnWindowFocus: false, + placeholderData: (previousData) => previousData ?? 
EMPTY_ENRICHMENT_MAP, + }); +}; diff --git a/src/hooks/queries/useMarketsQuery.ts b/src/hooks/queries/useMarketsQuery.ts index d0d9e4db..f2f20392 100644 --- a/src/hooks/queries/useMarketsQuery.ts +++ b/src/hooks/queries/useMarketsQuery.ts @@ -3,7 +3,7 @@ import { supportsMorphoApi } from '@/config/dataSources'; import { fetchMonarchMarkets } from '@/data-sources/monarch-api'; import { fetchMorphoMarkets } from '@/data-sources/morpho-api/market'; import { fetchSubgraphMarkets } from '@/data-sources/subgraph/market'; -import { ALL_SUPPORTED_NETWORKS, isSupportedChain } from '@/utils/networks'; +import { ALL_SUPPORTED_NETWORKS, isSupportedChain, type SupportedNetworks } from '@/utils/networks'; import type { Market } from '@/utils/types'; const toError = (error: unknown): Error => { @@ -15,10 +15,10 @@ const toError = (error: unknown): Error => { * Fetches markets from all supported networks using React Query. * * Data fetching strategy: - * - Tries Morpho API first (if supported) - * - Falls back to Monarch API if Morpho fails - * - Falls back to Subgraph if Monarch fails or Morpho is unsupported - * - Combines markets from all networks + * - Tries Monarch API first using one shared multi-chain registry request + * - Falls back per chain to Morpho API if Monarch is missing that chain or the multi-chain request fails + * - Falls back per chain to Subgraph if Morpho fails or is unsupported + * - Combines markets from all networks while preserving per-chain isolation * - Applies basic filtering (required fields, supported chains) * * Cache behavior: @@ -35,59 +35,98 @@ export const useMarketsQuery = () => { return useQuery({ queryKey: ['markets'], queryFn: async () => { - const combinedMarkets: Market[] = []; const fetchErrors: Error[] = []; + const marketsByChain = new Map(); + const missingNetworks = new Set(ALL_SUPPORTED_NETWORKS); - // Fetch markets for each network based on its data source. 
- // Use allSettled so a single chain failure cannot reject the whole query. - const results = await Promise.allSettled( - ALL_SUPPORTED_NETWORKS.map(async (network) => { - let networkMarkets: Market[] = []; - let tryMonarch = false; - let trySubgraph = !supportsMorphoApi(network); - - // Try Morpho API first if supported - if (!trySubgraph) { - try { - networkMarkets = await fetchMorphoMarkets(network); - } catch (error) { - console.warn(`Morpho markets failed for network ${network}, falling back to Monarch API.`, error); - tryMonarch = true; - } + const setMarketsForChain = (network: SupportedNetworks, markets: Market[]) => { + if (markets.length === 0) { + return; + } + + marketsByChain.set(network, markets); + missingNetworks.delete(network); + }; + + const partitionMarketsByChain = (markets: Market[]): Map => { + const grouped = new Map(); + + for (const market of markets) { + const chainId = market.morphoBlue.chain.id; + if (!isSupportedChain(chainId)) { + continue; } - // If Morpho API failed, try Monarch before Subgraph - if (tryMonarch) { - try { - networkMarkets = await fetchMonarchMarkets(network); - } catch (error) { - console.warn(`Monarch markets failed for network ${network}, falling back to Subgraph.`, error); - trySubgraph = true; + const chainMarkets = grouped.get(chainId) ?? []; + chainMarkets.push(market); + grouped.set(chainId, chainMarkets); + } + + return grouped; + }; + + try { + const monarchMarkets = await fetchMonarchMarkets(); + const monarchMarketsByChain = partitionMarketsByChain(monarchMarkets); + + for (const [network, markets] of monarchMarketsByChain.entries()) { + setMarketsForChain(network, markets); + } + } catch (error) { + const monarchError = toError(error); + console.warn('Monarch multi-chain markets fetch failed. 
Falling back per chain to Morpho/Subgraph.', monarchError);
+    fetchErrors.push(monarchError);
+  }
+
+  const fetchFallbackMarketsForNetwork = async (network: SupportedNetworks): Promise<{ network: SupportedNetworks; markets: Market[] }> => {
+    if (supportsMorphoApi(network)) {
+      try {
+        const morphoMarkets = await fetchMorphoMarkets(network);
+        if (morphoMarkets.length > 0) {
+          return {
+            network,
+            markets: morphoMarkets,
+          };
         }
-      }
-      // If Morpho is unsupported or Monarch fallback failed, try Subgraph
-      if (trySubgraph) {
-        networkMarkets = await fetchSubgraphMarkets(network);
+        console.warn(`Morpho markets returned empty for network ${network}, falling back to Subgraph.`);
+      } catch (error) {
+        console.warn(`Morpho markets failed for network ${network}, falling back to Subgraph.`, error);
       }
+    }
-    return networkMarkets;
-  }),
-);
+    return {
+      network,
+      markets: await fetchSubgraphMarkets(network),
+    };
+  };
+
+  const fallbackNetworks = Array.from(missingNetworks);
+  const results = await Promise.allSettled(fallbackNetworks.map((network) => fetchFallbackMarketsForNetwork(network)));
 
   results.forEach((result, index) => {
     if (result.status === 'fulfilled') {
-      combinedMarkets.push(...result.value);
+      setMarketsForChain(result.value.network, result.value.markets);
     } else {
-      const network = ALL_SUPPORTED_NETWORKS[index];
+      const network = fallbackNetworks[index];
       const error = toError(result.reason);
       console.error(`Failed to fetch markets for network ${network}:`, error);
       fetchErrors.push(error);
     }
   });
 
+  const combinedMarkets = Array.from(marketsByChain.values()).flat();
+  const dedupedMarkets = Array.from(
+    combinedMarkets
+      .reduce((acc, market) => {
+        acc.set(`${market.morphoBlue.chain.id}-${market.uniqueKey.toLowerCase()}`, market);
+        return acc;
+      }, new Map())
+      .values(),
+  );
+
   // Apply basic filtering
-  const filtered = combinedMarkets
+  const filtered = dedupedMarkets
     .filter((market) => market.uniqueKey !== undefined)
     .filter((market) => market.loanAsset && market.collateralAsset)
     .filter((market) => isSupportedChain(market.morphoBlue.chain.id));
diff --git a/src/hooks/useFreshMarketsState.ts b/src/hooks/useFreshMarketsState.ts
index 82631693..35e88e7b 100644
--- a/src/hooks/useFreshMarketsState.ts
+++ b/src/hooks/useFreshMarketsState.ts
@@ -1,21 +1,12 @@
 import { useEffect, useMemo } from 'react';
 import { useQuery, useQueryClient } from '@tanstack/react-query';
 import { usePublicClient } from 'wagmi';
-import morphoABI from '@/abis/morpho';
-import { getMorphoAddress } from '@/utils/morpho';
+import { fetchMarketsSnapshots } from '@/utils/positions';
 import type { SupportedNetworks } from '@/utils/networks';
 import type { Market } from '@/utils/types';
 
 const REFRESH_INTERVAL = 15_000; // 15 seconds
 
-type MarketSnapshot = {
-  totalSupplyAssets: string;
-  totalSupplyShares: string;
-  totalBorrowAssets: string;
-  totalBorrowShares: string;
-  liquidityAssets: string;
-};
-
 /**
  * Hook to fetch fresh market states using multicall.
  * Works efficiently for both single and multiple markets.
@@ -68,48 +59,13 @@ export const useFreshMarketsState = (
       }
 
       console.log(`Reading fresh state for ${markets.length} markets from chain...`);
-
-      // Create multicall contracts for all markets
-      const contracts = markets.map((market) => ({
-        address: getMorphoAddress(effectiveChainId) as `0x${string}`,
-        abi: morphoABI,
-        functionName: 'market' as const,
-        args: [market.uniqueKey as `0x${string}`],
-      }));
-
-      // Use multicall to batch all market queries into a single RPC call
-      const results = await publicClient.multicall({
-        contracts,
-        allowFailure: true,
-      });
-
+      const snapshotsMap = await fetchMarketsSnapshots(
+        markets.map((market) => market.uniqueKey),
+        effectiveChainId,
+        publicClient,
+      );
       console.log(`complete reading ${markets.length} market states`);
 
-      // Process results into snapshots map
-      const snapshotsMap = new Map();
-
-      results.forEach((result, index) => {
-        const market = markets[index];
-        if (result.status === 'success' && result.result) {
-          const data = result.result as readonly bigint[];
-          const totalSupplyAssets = data[0];
-          const totalSupplyShares = data[1];
-          const totalBorrowAssets = data[2];
-          const totalBorrowShares = data[3];
-          const liquidityAssets = totalSupplyAssets - totalBorrowAssets;
-
-          snapshotsMap.set(market.uniqueKey, {
-            totalSupplyAssets: totalSupplyAssets.toString(),
-            totalSupplyShares: totalSupplyShares.toString(),
-            totalBorrowAssets: totalBorrowAssets.toString(),
-            totalBorrowShares: totalBorrowShares.toString(),
-            liquidityAssets: liquidityAssets.toString(),
-          });
-        } else {
-          console.warn(`Failed to fetch snapshot for market ${market.uniqueKey}`);
-        }
-      });
-
       return snapshotsMap;
     },
     enabled: !!markets && markets.length > 0 && !!effectiveChainId && !!publicClient,
@@ -137,7 +93,7 @@ export const useFreshMarketsState = (
     if (!snapshots) return markets;
 
     return markets.map((market) => {
-      const snapshot = snapshots.get(market.uniqueKey);
+      const snapshot = snapshots.get(market.uniqueKey.toLowerCase());
       if (!snapshot) return market;
 
       return {
diff --git a/src/hooks/useProcessedMarkets.ts b/src/hooks/useProcessedMarkets.ts
index 591d56bb..5c0319c6 100644
--- a/src/hooks/useProcessedMarkets.ts
+++ b/src/hooks/useProcessedMarkets.ts
@@ -1,14 +1,18 @@
 import { useMemo } from 'react';
+import { useMarketRateEnrichmentQuery } from '@/hooks/queries/useMarketRateEnrichmentQuery';
 import { useMarketsQuery } from '@/hooks/queries/useMarketsQuery';
 import { useTokenPrices } from '@/hooks/useTokenPrices';
 import { useBlacklistedMarkets } from '@/stores/useBlacklistedMarkets';
 import { useAppSettings } from '@/stores/useAppSettings';
+import { getMarketRateEnrichmentKey, type MarketRateEnrichmentMap } from '@/utils/market-rate-enrichment';
 import { isForceUnwhitelisted } from '@/utils/markets';
 import { getTokenPriceKey } from '@/data-sources/morpho-api/prices';
 import { formatBalance } from '@/utils/balance';
 import type { TokenPriceInput } from '@/data-sources/morpho-api/prices';
 import type { Market } from '@/utils/types';
 
+const EMPTY_RATE_ENRICHMENTS: MarketRateEnrichmentMap = new Map();
+
 const hasPositiveAssets = (value?: string): boolean => {
   if (!value) return false;
   try {
@@ -28,6 +32,11 @@ const shouldComputeUsd = (usdValue: number | null | undefined, assets?: string):
   return false;
 };
 
+const shouldResolveUsdValue = (usdValue: number | null | undefined, assets: string | undefined, replaceEstimated: boolean): boolean => {
+  if (replaceEstimated) return hasPositiveAssets(assets);
+  return shouldComputeUsd(usdValue, assets);
+};
+
 const computeUsdValue = (assets: string, decimals: number, price: number): number => {
   return formatBalance(assets, decimals) * price;
 };
@@ -40,7 +49,7 @@ const computeUsdValue = (assets: string, decimals: number, price: number): numbe
  * Processing steps:
  * 1. Get raw markets from React Query
  * 2. Remove blacklisted markets
- * 3. Enrich with oracle data
+ * 3. Enrich optional historical market rates via RPC/archive-node snapshots
  * 4. Separate into allMarkets and whitelistedMarkets
  *
  * @returns Processed markets with loading states
@@ -99,9 +108,34 @@ export const useProcessedMarkets = () => {
     };
   }, [rawMarketsFromQuery, allBlacklistedMarketKeys]);
 
-  // Build token list for USD fallbacks only when needed
-  const tokensForUsdFallback = useMemo(() => {
-    if (!processedData.allMarkets.length) return [];
+  const { data: marketRateEnrichments = EMPTY_RATE_ENRICHMENTS, isRefetching: isRateEnrichmentRefetching } = useMarketRateEnrichmentQuery(
+    processedData.allMarkets,
+  );
+
+  const allMarketsWithRates = useMemo(() => {
+    if (!processedData.allMarkets.length || marketRateEnrichments.size === 0) {
+      return processedData.allMarkets;
+    }
+
+    return processedData.allMarkets.map((market) => {
+      const enrichment = marketRateEnrichments.get(getMarketRateEnrichmentKey(market.uniqueKey, market.morphoBlue.chain.id));
+      if (!enrichment) {
+        return market;
+      }
+
+      return {
+        ...market,
+        state: {
+          ...market.state,
+          ...enrichment,
+        },
+      };
+    });
+  }, [processedData.allMarkets, marketRateEnrichments]);
+
+  // Build token list only for markets whose USD values need to be backfilled or upgraded from estimated prices.
+  const tokensForUsdResolution = useMemo(() => {
+    if (!allMarketsWithRates.length) return [];
 
     const tokens: TokenPriceInput[] = [];
     const seen = new Set();
 
@@ -113,10 +147,15 @@ export const useProcessedMarkets = () => {
       tokens.push({ address, chainId });
     };
 
-    processedData.allMarkets.forEach((market) => {
+    allMarketsWithRates.forEach((market) => {
       const chainId = market.morphoBlue.chain.id;
+      const hasLoanExposure =
+        hasPositiveAssets(market.state?.supplyAssets) ||
+        hasPositiveAssets(market.state?.borrowAssets) ||
+        hasPositiveAssets(market.state?.liquidityAssets);
 
       const needsLoanUsd =
+        (!market.hasUSDPrice && hasLoanExposure) ||
         shouldComputeUsd(market.state?.supplyAssetsUsd, market.state?.supplyAssets) ||
         shouldComputeUsd(market.state?.borrowAssetsUsd, market.state?.borrowAssets) ||
         shouldComputeUsd(market.state?.liquidityAssetsUsd, market.state?.liquidityAssets);
@@ -133,32 +172,37 @@ export const useProcessedMarkets = () => {
     });
 
     return tokens;
-  }, [processedData.allMarkets]);
+  }, [allMarketsWithRates]);
 
-  const { prices: tokenPrices } = useTokenPrices(tokensForUsdFallback);
+  const { prices: tokenPrices, directPriceKeys } = useTokenPrices(tokensForUsdResolution);
 
   const allMarketsWithUsd = useMemo(() => {
-    if (!processedData.allMarkets.length) return processedData.allMarkets;
-    if (tokensForUsdFallback.length === 0 || tokenPrices.size === 0) return processedData.allMarkets;
+    if (!allMarketsWithRates.length) return allMarketsWithRates;
+    if (tokensForUsdResolution.length === 0 || tokenPrices.size === 0) return allMarketsWithRates;
 
-    return processedData.allMarkets.map((market) => {
+    return allMarketsWithRates.map((market) => {
       const chainId = market.morphoBlue.chain.id;
-      const loanPrice = tokenPrices.get(getTokenPriceKey(market.loanAsset.address, chainId));
-      const collateralPrice = tokenPrices.get(getTokenPriceKey(market.collateralAsset.address, chainId));
+      const loanPriceKey = getTokenPriceKey(market.loanAsset.address, chainId);
+      const collateralPriceKey = getTokenPriceKey(market.collateralAsset.address, chainId);
+      const loanPrice = tokenPrices.get(loanPriceKey);
+      const collateralPrice = tokenPrices.get(collateralPriceKey);
+      const hasDirectLoanPrice = directPriceKeys.has(loanPriceKey);
+      const shouldReplaceEstimatedLoanUsd = !market.hasUSDPrice && hasDirectLoanPrice;
+      const shouldReplaceEstimatedCollateralUsd = !market.hasUSDPrice && directPriceKeys.has(collateralPriceKey);
 
       let nextState = market.state;
       let changed = false;
 
       if (loanPrice !== undefined && Number.isFinite(loanPrice)) {
-        if (shouldComputeUsd(nextState.supplyAssetsUsd, nextState.supplyAssets)) {
+        if (shouldResolveUsdValue(nextState.supplyAssetsUsd, nextState.supplyAssets, shouldReplaceEstimatedLoanUsd)) {
           nextState = { ...nextState, supplyAssetsUsd: computeUsdValue(nextState.supplyAssets, market.loanAsset.decimals, loanPrice) };
           changed = true;
         }
-        if (shouldComputeUsd(nextState.borrowAssetsUsd, nextState.borrowAssets)) {
+        if (shouldResolveUsdValue(nextState.borrowAssetsUsd, nextState.borrowAssets, shouldReplaceEstimatedLoanUsd)) {
           nextState = { ...nextState, borrowAssetsUsd: computeUsdValue(nextState.borrowAssets, market.loanAsset.decimals, loanPrice) };
           changed = true;
         }
-        if (shouldComputeUsd(nextState.liquidityAssetsUsd, nextState.liquidityAssets)) {
+        if (shouldResolveUsdValue(nextState.liquidityAssetsUsd, nextState.liquidityAssets, shouldReplaceEstimatedLoanUsd)) {
           nextState = {
             ...nextState,
             liquidityAssetsUsd: computeUsdValue(nextState.liquidityAssets, market.loanAsset.decimals, loanPrice),
@@ -170,7 +214,7 @@ export const useProcessedMarkets = () => {
       if (
         collateralPrice !== undefined &&
         Number.isFinite(collateralPrice) &&
-        shouldComputeUsd(nextState.collateralAssetsUsd ?? null, nextState.collateralAssets)
+        shouldResolveUsdValue(nextState.collateralAssetsUsd ?? null, nextState.collateralAssets, shouldReplaceEstimatedCollateralUsd)
       ) {
         nextState = {
           ...nextState,
@@ -179,9 +223,19 @@ export const useProcessedMarkets = () => {
         changed = true;
       }
 
-      return changed ? { ...market, state: nextState } : market;
+      const nextHasUsdPrice = market.hasUSDPrice || hasDirectLoanPrice;
+
+      if (!changed && nextHasUsdPrice === market.hasUSDPrice) {
+        return market;
+      }
+
+      return {
+        ...market,
+        state: nextState,
+        hasUSDPrice: nextHasUsdPrice,
+      };
     });
-  }, [processedData.allMarkets, tokenPrices, tokensForUsdFallback]);
+  }, [allMarketsWithRates, directPriceKeys, tokenPrices, tokensForUsdResolution]);
 
   const whitelistedMarketsWithUsd = useMemo(() => {
     return allMarketsWithUsd.filter((market) => market.whitelisted);
@@ -198,7 +252,7 @@
     whitelistedMarkets: whitelistedMarketsWithUsd,
     markets, // Computed from setting (backward compatible with old context)
     loading: isLoading,
-    isRefetching,
+    isRefetching: isRefetching || isRateEnrichmentRefetching,
     error,
     refetch,
   };
diff --git a/src/hooks/useTokenPrices.ts b/src/hooks/useTokenPrices.ts
index 68d0a9f3..9448a591 100644
--- a/src/hooks/useTokenPrices.ts
+++ b/src/hooks/useTokenPrices.ts
@@ -20,6 +20,7 @@ export const tokenPriceKeys = {
 
 type UseTokenPricesReturn = {
   prices: Map<string, number>;
+  directPriceKeys: Set<string>;
   isLoading: boolean;
   error: Error | null;
 };
@@ -173,8 +174,13 @@ export const useTokenPrices = (tokens: TokenPriceInput[]): UseTokenPricesReturn
     return resolvedPrices;
   }, [prices, stableTokens, tokensWithPegRefs, majorPrices]);
 
+  const directPriceKeys = useMemo(() => {
+    return new Set((prices ?? new Map()).keys());
+  }, [prices]);
+
   return {
     prices: pricesWithFallback,
+    directPriceKeys,
     isLoading,
     error: error ?? null,
   };
diff --git a/src/utils/market-rate-enrichment.ts b/src/utils/market-rate-enrichment.ts
new file mode 100644
index 00000000..d43db8fe
--- /dev/null
+++ b/src/utils/market-rate-enrichment.ts
@@ -0,0 +1,176 @@
+import { computeAnnualizedApyFromGrowth } from '@/hooks/leverage/math';
+import { fetchBlocksWithTimestamps } from '@/utils/blockEstimation';
+import { type MarketSnapshot, fetchMarketsSnapshots } from '@/utils/positions';
+import { getClient } from '@/utils/rpc';
+import type { CustomRpcUrls } from '@/stores/useCustomRpc';
+import type { SupportedNetworks } from '@/utils/networks';
+import type { Market } from '@/utils/types';
+
+const SECONDS_PER_DAY = 24 * 60 * 60;
+const SHARE_PRICE_SCALE = 10n ** 18n;
+
+const LOOKBACK_WINDOWS = [
+  {
+    key: 'daily',
+    seconds: SECONDS_PER_DAY,
+    supplyField: 'dailySupplyApy',
+    borrowField: 'dailyBorrowApy',
+  },
+  {
+    key: 'weekly',
+    seconds: 7 * SECONDS_PER_DAY,
+    supplyField: 'weeklySupplyApy',
+    borrowField: 'weeklyBorrowApy',
+  },
+  {
+    key: 'monthly',
+    seconds: 30 * SECONDS_PER_DAY,
+    supplyField: 'monthlySupplyApy',
+    borrowField: 'monthlyBorrowApy',
+  },
+] as const;
+
+export type MarketRateEnrichment = Pick<
+  Market['state'],
+  'dailySupplyApy' | 'dailyBorrowApy' | 'weeklySupplyApy' | 'weeklyBorrowApy' | 'monthlySupplyApy' | 'monthlyBorrowApy'
+>;
+
+export type MarketRateEnrichmentMap = Map<string, MarketRateEnrichment>;
+
+const buildNullEnrichment = (): MarketRateEnrichment => ({
+  dailySupplyApy: null,
+  dailyBorrowApy: null,
+  weeklySupplyApy: null,
+  weeklyBorrowApy: null,
+  monthlySupplyApy: null,
+  monthlyBorrowApy: null,
+});
+
+export const getMarketRateEnrichmentKey = (marketId: string, chainId: number): string => `${chainId}-${marketId.toLowerCase()}`;
+
+const computeSharePrice = (assets: string, shares: string): bigint | null => {
+  try {
+    const assetAmount = BigInt(assets);
+    const shareAmount = BigInt(shares);
+
+    if (assetAmount <= 0n || shareAmount <= 0n) {
+      return null;
+    }
+
+    return (assetAmount * SHARE_PRICE_SCALE) / shareAmount;
+  } catch {
+    return null;
+  }
+};
+
+const computeRealizedRate = ({
+  currentSnapshot,
+  pastSnapshot,
+  periodSeconds,
+  side,
+}: {
+  currentSnapshot: MarketSnapshot | undefined;
+  pastSnapshot: MarketSnapshot | undefined;
+  periodSeconds: number;
+  side: 'supply' | 'borrow';
+}): number | null => {
+  if (!currentSnapshot || !pastSnapshot || periodSeconds <= 0) {
+    return null;
+  }
+
+  const currentSharePrice =
+    side === 'supply'
+      ? computeSharePrice(currentSnapshot.totalSupplyAssets, currentSnapshot.totalSupplyShares)
+      : computeSharePrice(currentSnapshot.totalBorrowAssets, currentSnapshot.totalBorrowShares);
+  const pastSharePrice =
+    side === 'supply'
+      ? computeSharePrice(pastSnapshot.totalSupplyAssets, pastSnapshot.totalSupplyShares)
+      : computeSharePrice(pastSnapshot.totalBorrowAssets, pastSnapshot.totalBorrowShares);
+
+  if (!currentSharePrice || !pastSharePrice) {
+    return null;
+  }
+
+  return computeAnnualizedApyFromGrowth({
+    currentValue: currentSharePrice,
+    pastValue: pastSharePrice,
+    periodSeconds,
+  });
+};
+
+export async function fetchMarketRateEnrichment(markets: Market[], customRpcUrls: CustomRpcUrls = {}): Promise<MarketRateEnrichmentMap> {
+  const enrichments = new Map();
+
+  if (markets.length === 0) {
+    return enrichments;
+  }
+
+  const marketsByChain = markets.reduce(
+    (acc, market) => {
+      const chainId = market.morphoBlue.chain.id;
+      const chainMarkets = acc[chainId] ?? [];
+      chainMarkets.push(market);
+      acc[chainId] = chainMarkets;
+      return acc;
+    },
+    {} as Record<number, Market[]>,
+  );
+
+  await Promise.all(
+    Object.entries(marketsByChain).map(async ([chainIdValue, chainMarkets]) => {
+      const chainId = Number(chainIdValue) as SupportedNetworks;
+
+      try {
+        const client = getClient(chainId, customRpcUrls[chainId]);
+        const currentBlock = await client.getBlockNumber();
+        const currentBlockData = await client.getBlock({ blockNumber: currentBlock });
+        const currentTimestamp = Number(currentBlockData.timestamp);
+        const targetTimestamps = LOOKBACK_WINDOWS.map((window) => currentTimestamp - window.seconds);
+        const blocksWithTimestamps = await fetchBlocksWithTimestamps(
+          client,
+          chainId,
+          targetTimestamps,
+          Number(currentBlock),
+          currentTimestamp,
+        );
+        const marketIds = chainMarkets.map((market) => market.uniqueKey);
+
+        const [currentSnapshots, ...pastSnapshots] = await Promise.all([
+          fetchMarketsSnapshots(marketIds, chainId, client, Number(currentBlock)),
+          ...blocksWithTimestamps.map((block) => fetchMarketsSnapshots(marketIds, chainId, client, block.blockNumber)),
+        ]);
+
+        for (const market of chainMarkets) {
+          const enrichment = buildNullEnrichment();
+          const marketKey = getMarketRateEnrichmentKey(market.uniqueKey, chainId);
+          const currentSnapshot = currentSnapshots.get(market.uniqueKey.toLowerCase());
+
+          LOOKBACK_WINDOWS.forEach((window, index) => {
+            const pastBlock = blocksWithTimestamps[index];
+            const pastSnapshot = pastSnapshots[index]?.get(market.uniqueKey.toLowerCase());
+            const periodSeconds = pastBlock ? currentTimestamp - pastBlock.timestamp : 0;
+
+            enrichment[window.supplyField] = computeRealizedRate({
+              currentSnapshot,
+              pastSnapshot,
+              periodSeconds,
+              side: 'supply',
+            });
+            enrichment[window.borrowField] = computeRealizedRate({
+              currentSnapshot,
+              pastSnapshot,
+              periodSeconds,
+              side: 'borrow',
+            });
+          });
+
+          enrichments.set(marketKey, enrichment);
+        }
+      } catch (error) {
+        console.warn(`[market-rate-enrichment] Failed to compute historical rates for chain ${chainId}:`, error);
+      }
+    }),
+  );
+
+  return enrichments;
+}
diff --git a/src/utils/markets.ts b/src/utils/markets.ts
index e9e8236c..986ab10d 100644
--- a/src/utils/markets.ts
+++ b/src/utils/markets.ts
@@ -1,3 +1,8 @@
+import { isAddress, zeroAddress } from 'viem';
+import { blacklistTokens } from '@/utils/tokens';
+
+const ZERO_ADDRESS = zeroAddress.toLowerCase();
+
 /**
  * Parse and normalize a numeric threshold value from user input.
  * - Empty string or "0" → 0 (no threshold)
@@ -17,6 +22,42 @@ export const parseNumericThreshold = (rawValue: string | undefined | null): numbe
   return Math.max(parsed, 0);
 };
 
+const normalizeAddress = (value: string | undefined | null): string => value?.toLowerCase() ?? '';
+const isValidRegistryAddress = (value: string): boolean => value.length > 0 && isAddress(value);
+
+export const isBlockedMarketToken = (address: string | undefined | null): boolean => {
+  const normalized = normalizeAddress(address);
+  return normalized.length > 0 && blacklistTokens.includes(normalized);
+};
+
+export const isMarketRegistryEntryAllowed = ({
+  loanAssetAddress,
+  collateralAssetAddress,
+  irmAddress,
+}: {
+  loanAssetAddress: string | undefined | null;
+  collateralAssetAddress: string | undefined | null;
+  irmAddress: string | undefined | null;
+}): boolean => {
+  const normalizedLoanAsset = normalizeAddress(loanAssetAddress);
+  const normalizedCollateralAsset = normalizeAddress(collateralAssetAddress);
+  const normalizedIrm = normalizeAddress(irmAddress);
+
+  if (!isValidRegistryAddress(normalizedLoanAsset) || !isValidRegistryAddress(normalizedCollateralAsset) || !isValidRegistryAddress(normalizedIrm)) {
+    return false;
+  }
+
+  if (normalizedCollateralAsset === ZERO_ADDRESS || normalizedIrm === ZERO_ADDRESS) {
+    return false;
+  }
+
+  if (isBlockedMarketToken(normalizedLoanAsset) || isBlockedMarketToken(normalizedCollateralAsset)) {
+    return false;
+  }
+
+  return true;
+};
+
 // Blacklisted markets by uniqueKey
 export const blacklistedMarkets = [
   '0x8eaf7b29f02ba8d8c1d7aeb587403dcb16e2e943e4e2f5f94b0963c2386406c9', // PAXG / USDC market with wrong oracle
diff --git a/src/utils/positions.ts b/src/utils/positions.ts
index 39ede40c..7f48d06c 100644
--- a/src/utils/positions.ts
+++ b/src/utils/positions.ts
@@ -21,6 +21,9 @@ export type MarketSnapshot = {
   liquidityAssets: string;
 };
 
+const MARKET_SNAPSHOT_BATCH_SIZE = 200;
+const MARKET_SNAPSHOT_PARALLEL_BATCHES = 4;
+
 // Types for contract responses
 type Position = {
   supplyShares: bigint;
@@ -324,38 +327,89 @@ export async function fetchMarketSnapshot(
   client: PublicClient,
   blockNumber?: number,
 ): Promise<MarketSnapshot | null> {
+  const snapshots = await fetchMarketsSnapshots([marketId], chainId, client, blockNumber);
+  return snapshots.get(marketId.toLowerCase()) ?? null;
+}
+
+export async function fetchMarketsSnapshots(
+  marketIds: string[],
+  chainId: number,
+  client: PublicClient,
+  blockNumber?: number,
+): Promise<Map<string, MarketSnapshot>> {
+  const snapshots = new Map();
+
+  if (marketIds.length === 0) {
+    return snapshots;
+  }
+
   try {
     const isLatest = blockNumber === undefined;
+    const morphoAddress = getMorphoAddress(chainId as SupportedNetworks);
 
-    // Get the market data
-    const marketArray = (await client.readContract({
-      address: getMorphoAddress(chainId as SupportedNetworks),
-      abi: morphoABI,
-      functionName: 'market',
-      args: [marketId as `0x${string}`],
-      blockNumber: isLatest ? undefined : BigInt(blockNumber),
-    })) as readonly bigint[];
+    for (let waveStart = 0; waveStart < marketIds.length; waveStart += MARKET_SNAPSHOT_BATCH_SIZE * MARKET_SNAPSHOT_PARALLEL_BATCHES) {
+      const waveChunks: string[][] = [];
+
+      for (let chunkIndex = 0; chunkIndex < MARKET_SNAPSHOT_PARALLEL_BATCHES; chunkIndex += 1) {
+        const chunkStart = waveStart + chunkIndex * MARKET_SNAPSHOT_BATCH_SIZE;
+        if (chunkStart >= marketIds.length) {
+          break;
+        }
 
-    // Convert array to market object
-    const market = arrayToMarket(marketArray);
+        waveChunks.push(marketIds.slice(chunkStart, chunkStart + MARKET_SNAPSHOT_BATCH_SIZE));
+      }
 
-    const liquidityAssets = market.totalSupplyAssets - market.totalBorrowAssets;
+      const waveResults = await Promise.all(
+        waveChunks.map((marketChunk) =>
+          client.multicall({
+            contracts: marketChunk.map((currentMarketId) => ({
+              address: morphoAddress as `0x${string}`,
+              abi: morphoABI,
+              functionName: 'market' as const,
+              args: [currentMarketId as `0x${string}`],
+            })),
+            allowFailure: true,
+            blockNumber: isLatest ? undefined : BigInt(blockNumber),
+          }),
+        ),
+      );
+
+      waveResults.forEach((results, waveIndex) => {
+        const marketChunk = waveChunks[waveIndex] ?? [];
+
+        results.forEach((result, resultIndex) => {
+          if (result.status !== 'success' || !result.result) {
+            return;
+          }
+
+          const marketId = marketChunk[resultIndex];
+          if (!marketId) {
+            return;
+          }
 
-    return {
-      totalSupplyAssets: market.totalSupplyAssets.toString(),
-      totalSupplyShares: market.totalSupplyShares.toString(),
-      totalBorrowAssets: market.totalBorrowAssets.toString(),
-      totalBorrowShares: market.totalBorrowShares.toString(),
-      liquidityAssets: liquidityAssets.toString(),
-    };
+          const market = arrayToMarket(result.result as readonly bigint[]);
+          const liquidityAssets = market.totalSupplyAssets - market.totalBorrowAssets;
+
+          snapshots.set(marketId.toLowerCase(), {
+            totalSupplyAssets: market.totalSupplyAssets.toString(),
+            totalSupplyShares: market.totalSupplyShares.toString(),
+            totalBorrowAssets: market.totalBorrowAssets.toString(),
+            totalBorrowShares: market.totalBorrowShares.toString(),
+            liquidityAssets: liquidityAssets.toString(),
+          });
+        });
+      });
+    }
+
+    return snapshots;
   } catch (error) {
     console.error('Error reading market:', {
-      marketId,
+      marketIds,
       chainId,
       blockNumber,
       error,
     });
-    return null;
+    return snapshots;
   }
 }
diff --git a/src/utils/tokenMetadata.ts b/src/utils/tokenMetadata.ts
index 68ec6dc0..47554e1f 100644
--- a/src/utils/tokenMetadata.ts
+++ b/src/utils/tokenMetadata.ts
@@ -1,4 +1,4 @@
-import { erc20Abi, type Address } from 'viem';
+import { erc20Abi, parseAbi, type Address, type Hex } from 'viem';
 import type { SupportedNetworks } from './networks';
 import { getClient } from './rpc';
 import { findToken, infoToKey } from './tokens';
@@ -14,7 +14,13 @@ export type ResolvedTokenInfo = {
   isRecognized: boolean;
 };
 
-const TOKEN_DECIMALS_BATCH_SIZE = 200;
+export type OnchainTokenMetadata = {
+  decimals?: number;
+  symbol?: string;
+};
+
+const TOKEN_METADATA_BATCH_SIZE = 200;
+const erc20SymbolBytes32Abi = parseAbi(['function symbol() view returns (bytes32)']);
 
 const normalizeAddress = (value: string): string => value.toLowerCase();
@@ -23,6 +29,27 @@ const formatUnknownTokenLabel = (address: string): string => {
   return `${normalizedAddress.slice(0, 6)}...${normalizedAddress.slice(-4)}`;
 };
 
+const normalizeTokenSymbol = (value: string | null | undefined): string | undefined => {
+  const trimmed = value?.trim();
+  return trimmed ? trimmed : undefined;
+};
+
+const decodeBytes32Symbol = (value: Hex): string | undefined => {
+  const hexValue = value.slice(2);
+  let decoded = '';
+
+  for (let index = 0; index < hexValue.length; index += 2) {
+    const codePoint = Number.parseInt(hexValue.slice(index, index + 2), 16);
+    if (codePoint === 0) {
+      break;
+    }
+
+    decoded += String.fromCharCode(codePoint);
+  }
+
+  return normalizeTokenSymbol(decoded);
+};
+
 const dedupeTokenInputs = (tokens: TokenAddressInput[]): TokenAddressInput[] => {
   const deduped = new Map();
 
@@ -41,54 +68,146 @@ const dedupeTokenInputs = (tokens: TokenAddressInput[]): TokenAddressInput[] =>
   return Array.from(deduped.values());
 };
 
-export const fetchTokenDecimalsMap = async (tokens: TokenAddressInput[]): Promise<Map<string, number>> => {
-  const uniqueTokens = dedupeTokenInputs(tokens);
-  const decimalsByToken = new Map();
-  const unresolvedByChain = new Map();
+const groupTokenInputsByChain = (tokens: TokenAddressInput[]): Map<SupportedNetworks, TokenAddressInput[]> => {
+  const tokensByChain = new Map();
 
-  for (const token of uniqueTokens) {
-    const knownToken = findToken(token.address, token.chainId);
+  for (const token of tokens) {
+    const tokensForChain = tokensByChain.get(token.chainId) ?? [];
+    tokensForChain.push(token);
+    tokensByChain.set(token.chainId, tokensForChain);
+  }
 
-    if (knownToken) {
-      decimalsByToken.set(infoToKey(token.address, token.chainId), knownToken.decimals);
-      continue;
-    }
+  return tokensByChain;
+};
 
-    const tokensForChain = unresolvedByChain.get(token.chainId) ?? [];
-    tokensForChain.push(token);
-    unresolvedByChain.set(token.chainId, tokensForChain);
+const getOrCreateTokenMetadata = (metadataByToken: Map<string, OnchainTokenMetadata>, key: string): OnchainTokenMetadata => {
+  const existing = metadataByToken.get(key);
+  if (existing) {
+    return existing;
   }
 
-  for (const [chainId, tokensForChain] of unresolvedByChain) {
+  const metadata: OnchainTokenMetadata = {};
+  metadataByToken.set(key, metadata);
+  return metadata;
+};
+
+export const fetchOnchainTokenMetadataMap = async (tokens: TokenAddressInput[]): Promise<Map<string, OnchainTokenMetadata>> => {
+  const uniqueTokens = dedupeTokenInputs(tokens);
+  const metadataByToken = new Map();
+
+  for (const [chainId, tokensForChain] of groupTokenInputsByChain(uniqueTokens)) {
     const client = getClient(chainId);
-    for (let start = 0; start < tokensForChain.length; start += TOKEN_DECIMALS_BATCH_SIZE) {
-      const tokenBatch = tokensForChain.slice(start, start + TOKEN_DECIMALS_BATCH_SIZE);
-      const results = await client.multicall({
-        contracts: tokenBatch.map((token) => ({
+
+    for (let start = 0; start < tokensForChain.length; start += TOKEN_METADATA_BATCH_SIZE) {
+      const tokenBatch = tokensForChain.slice(start, start + TOKEN_METADATA_BATCH_SIZE);
+      const [decimalsResults, symbolResults] = await Promise.all([
+        client.multicall({
+          contracts: tokenBatch.map((token) => ({
+            address: token.address as Address,
+            abi: erc20Abi,
+            functionName: 'decimals' as const,
+          })),
+          allowFailure: true,
+        }),
+        client.multicall({
+          contracts: tokenBatch.map((token) => ({
+            address: token.address as Address,
+            abi: erc20Abi,
+            functionName: 'symbol' as const,
+          })),
+          allowFailure: true,
+        }),
+      ]);
+
+      const bytes32FallbackTokens: TokenAddressInput[] = [];
+
+      for (const [index, token] of tokenBatch.entries()) {
+        const key = infoToKey(token.address, chainId);
+        const metadata = getOrCreateTokenMetadata(metadataByToken, key);
+        const decimalsResult = decimalsResults[index];
+        const symbolResult = symbolResults[index];
+
+        if (decimalsResult.status === 'success' && decimalsResult.result !== undefined) {
+          metadata.decimals = Number(decimalsResult.result);
+        }
+
+        const symbol = symbolResult.status === 'success' ? normalizeTokenSymbol(symbolResult.result) : undefined;
+
+        if (symbol) {
+          metadata.symbol = symbol;
+          continue;
+        }
+
+        bytes32FallbackTokens.push(token);
+      }
+
+      if (bytes32FallbackTokens.length === 0) {
+        continue;
+      }
+
+      const bytes32SymbolResults = await client.multicall({
+        contracts: bytes32FallbackTokens.map((token) => ({
           address: token.address as Address,
-          abi: erc20Abi,
-          functionName: 'decimals' as const,
+          abi: erc20SymbolBytes32Abi,
+          functionName: 'symbol' as const,
         })),
         allowFailure: true,
       });
 
-      for (const [index, result] of results.entries()) {
+      for (const [index, result] of bytes32SymbolResults.entries()) {
         if (result.status !== 'success' || result.result === undefined) {
           continue;
         }
 
-        decimalsByToken.set(infoToKey(tokenBatch[index].address, chainId), Number(result.result));
+        const token = bytes32FallbackTokens[index];
+        const key = infoToKey(token.address, chainId);
+        const symbol = decodeBytes32Symbol(result.result);
+
+        if (!symbol) {
+          continue;
+        }
+
+        getOrCreateTokenMetadata(metadataByToken, key).symbol = symbol;
       }
     }
   }
 
+  return metadataByToken;
+};
+
+export const fetchTokenDecimalsMap = async (tokens: TokenAddressInput[]): Promise<Map<string, number>> => {
+  const uniqueTokens = dedupeTokenInputs(tokens);
+  const decimalsByToken = new Map();
+  const unresolvedTokens: TokenAddressInput[] = [];
+
+  for (const token of uniqueTokens) {
+    const knownToken = findToken(token.address, token.chainId);
+
+    if (knownToken) {
+      decimalsByToken.set(infoToKey(token.address, token.chainId), knownToken.decimals);
+      continue;
+    }
+
+    unresolvedTokens.push(token);
+  }
+
+  const onchainMetadataByToken = await fetchOnchainTokenMetadataMap(unresolvedTokens);
+
+  for (const token of unresolvedTokens) {
+    const decimals = onchainMetadataByToken.get(infoToKey(token.address, token.chainId))?.decimals;
+    if (decimals !== undefined) {
+      decimalsByToken.set(infoToKey(token.address, token.chainId), decimals);
+    }
+  }
+
   return decimalsByToken;
 };
 
 export const resolveTokenInfos = async (tokens: TokenAddressInput[]): Promise<Map<string, ResolvedTokenInfo>> => {
   const uniqueTokens = dedupeTokenInputs(tokens);
-  const decimalsByToken = await fetchTokenDecimalsMap(uniqueTokens);
   const resolvedTokenInfos = new Map();
+  const unresolvedTokens = uniqueTokens.filter((token) => !findToken(token.address, token.chainId));
+  const onchainMetadataByToken = await fetchOnchainTokenMetadataMap(unresolvedTokens);
 
   for (const token of uniqueTokens) {
     const key = infoToKey(token.address, token.chainId);
@@ -108,18 +227,19 @@ export const resolveTokenInfos = async (tokens: TokenAddressInput[]): Promise