Conversation
It's getting messy and only going to get worse, so let's split them up. This also moves the non-composables into `utils`.
|
Vercel: 2 Skipped Deployments
|
📝 Walkthrough

This pull request removes the monolithic …

🚥 Pre-merge checks: ✅ Passed checks (1 passed)
|
Actionable comments posted: 6
🧹 Nitpick comments (5)
app/utils/npm/api.ts (1)

45-64: Consider handling rejected promises in the cache.

The caching pattern stores the promise immediately, which prevents duplicate in-flight requests. However, if `getVersions` fails, the rejected promise remains cached, causing all subsequent calls for that package to return the same error without retrying.

♻️ Proposed fix to handle failed requests

```diff
 export async function fetchAllPackageVersions(packageName: string): Promise<PackageVersionInfo[]> {
   const cached = allVersionsCache.get(packageName)
   if (cached) return cached
   const promise = (async () => {
     const data = await getVersions(packageName, { metadata: true })
     return Object.entries(data.versionsMeta)
       .map(([version, meta]) => ({
         version,
         time: meta.time,
         hasProvenance: meta.provenance === 'trustedPublisher' || meta.provenance === true,
         deprecated: meta.deprecated,
       }))
       .sort((a, b) => compare(b.version, a.version))
-  })()
+  })().catch((error) => {
+    allVersionsCache.delete(packageName)
+    throw error
+  })
   allVersionsCache.set(packageName, promise)
   return promise
 }
```

app/composables/npm/useOutdatedDependencies.ts (1)
123-129: Consider error handling for the watch callback.

The async `fetchOutdatedInfo` is called without `await` or `.catch()`. If an unhandled rejection occurs, it won't be caught. Consider adding error handling.

♻️ Suggested improvement

```diff
 watch(
   () => toValue(dependencies),
   deps => {
-    fetchOutdatedInfo(deps)
+    fetchOutdatedInfo(deps).catch(err => {
+      // eslint-disable-next-line no-console
+      console.error('Failed to fetch outdated info:', err)
+    })
   },
   { immediate: true },
 )
```

app/composables/npm/useNpmSearch.ts (3)
69-69: Unused variable `lastSearch`.

The variable `lastSearch` is declared and used as the default value for `useLazyAsyncData` but is never assigned a value after initialisation. Consider removing it and using `emptySearchResponse` directly as the default.

♻️ Proposed fix

```diff
-  // Standard (non-incremental) search implementation
-  let lastSearch: NpmSearchResponse | undefined = undefined
-
   const asyncData = useLazyAsyncData(
     () => `search:incremental:${toValue(query)}`,
     async (_nuxtApp, { signal }) => {
       // ... handler code ...
     },
-    { default: () => lastSearch || emptySearchResponse },
+    { default: () => emptySearchResponse },
   )
```
182-186: Missing `signal` parameter in `fetchMore` fetch call.

The `cachedFetch` in `fetchMore` doesn't receive a signal, so the request cannot be cancelled if the component unmounts or the query changes during the fetch. Consider passing a signal or implementing cancellation logic.

♻️ Suggested improvement

Consider using an `AbortController` for cancellation support:

```diff
+  const fetchMoreController = shallowRef<AbortController | null>(null)
+
   async function fetchMore(targetSize: number): Promise<void> {
+    // Cancel any in-flight fetchMore request
+    fetchMoreController.value?.abort()
+    const controller = new AbortController()
+    fetchMoreController.value = controller
+
     // ... existing code ...
     const { data: response } = await cachedFetch<NpmSearchResponse>(
       `${NPM_REGISTRY}/-/v1/search?${params.toString()}`,
-      {},
+      { signal: controller.signal },
       60,
     )
```
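The cancel-previous-request pattern suggested above can be sketched in isolation. This is a hedged sketch, not code from this PR: `createLatestOnlyFetcher` and `doFetch` are hypothetical names, with `doFetch` standing in for the composable's real `cachedFetch` call.

```typescript
// Sketch of the "latest request wins" AbortController pattern.
// Each new call aborts the previous in-flight request before starting its own.
function createLatestOnlyFetcher<T>(doFetch: (signal: AbortSignal) => Promise<T>) {
  let controller: AbortController | null = null
  return async function fetchLatest(): Promise<T> {
    controller?.abort() // cancel the previous in-flight request, if any
    controller = new AbortController()
    return doFetch(controller.signal)
  }
}
```

The composable version would additionally store the controller in a `shallowRef` so it can be aborted on unmount.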
206-211: Recursive `fetchMore` call may cause stack overflow for very large result sets.

If there's a significant gap between the current count and the target size, the recursive call pattern could cause issues. Consider converting to an iterative approach.

♻️ Iterative alternative

```diff
-    // If we still need more, fetch again recursively
-    if (
-      cache.value.objects.length < targetSize &&
-      cache.value.objects.length < cache.value.total
-    ) {
-      await fetchMore(targetSize)
-    }
+    // Continue fetching in a loop instead of recursion
+    while (
+      cache.value &&
+      cache.value.objects.length < targetSize &&
+      cache.value.objects.length < cache.value.total
+    ) {
+      const nextFrom = cache.value.objects.length
+      const nextSize = Math.min(targetSize - nextFrom, cache.value.total - nextFrom)
+      // ... fetch and merge logic ...
+    }
```
```ts
import type { NpmSearchResponse, NpmSearchResult, MinimalPackument } from '#shared/types'
import { emptySearchResponse, packumentToSearchResult } from './useNpmSearch'
import { NPM_REGISTRY, NPM_API } from '~/utils/npm/common'
import { mapWithConcurrency } from '#shared/utils/async'
```
🧩 Analysis chain

🏁 Script executed:

```shell
#!/bin/bash
# Search for encodePackageName definition and usage
rg -n "encodePackageName" --type=ts --type=tsx --type=vue -B 2 -A 2
```

Repository: npmx-dev/npmx.dev
Length of output: 88

🏁 Script executed:

```shell
#!/bin/bash
# Read the useOrgPackages.ts file to check imports and usage
cat -n app/composables/npm/useOrgPackages.ts | head -200
```

Length of output: 6789

🏁 Script executed:

```shell
#!/bin/bash
# Search for encodePackageName definition across codebase
rg -n "encodePackageName" --type=ts -B 2 -A 2
```

Length of output: 7380

🏁 Script executed:

```shell
#!/bin/bash
# Search for exports of encodePackageName
rg -n "export.*encodePackageName" --type=ts
```

Length of output: 141
Add missing import for `encodePackageName`.

`encodePackageName` is used at lines 56 and 126 but is not imported. Add the following import to the module:

```ts
import { encodePackageName } from '#shared/utils/npm'
```

```ts
// Fetch scoped packages in parallel batches (concurrency limit to avoid overwhelming the API)
// Use Promise.allSettled to not fail on individual errors
const scopedBatchSize = 20 // Concurrent requests per batch
for (let i = 0; i < scopedPackages.length; i += scopedBatchSize) {
  const batch = scopedPackages.slice(i, i + scopedBatchSize)
  bulkPromises.push(
    (async () => {
      const results = await Promise.allSettled(
        batch.map(async name => {
          const encoded = encodePackageName(name)
          const data = await $fetch<{ downloads: number }>(
            `${NPM_API}/downloads/point/last-week/${encoded}`,
          )
          return { name, downloads: data.downloads }
        }),
      )
      for (const result of results) {
        if (result.status === 'fulfilled' && result.value.downloads !== undefined) {
          downloads.set(result.value.name, result.value.downloads)
        }
      }
    })(),
  )
```
🧩 Analysis chain

🏁 Script executed:

```shell
cat -n app/composables/npm/useOrgPackages.ts | head -80
```

Length of output: 3589

🏁 Script executed:

```shell
rg "fetchBulkDownloads" app/composables/npm/useOrgPackages.ts
```

Length of output: 149
Sequential batching required for scoped packages; options not passed to `$fetch`.

All batches execute immediately in parallel (line 52 pushes IIFEs that start executing), so total concurrency scales with the number of batches × 20, not capped at 20 as intended. This defeats the batching mechanism and can overwhelm the npm API. Additionally, the `$fetch` call at line 57 does not pass the `options` parameter (compare line 33 for unscoped packages), so abort signals and custom headers are lost, which is inconsistent with the unscoped package handling.

Move the `await Promise.allSettled()` inside the loop (remove the IIFE wrapper) to enforce sequential batch processing, and pass `options` to the `$fetch` call:
Sequential batches + pass options

```diff
 // Fetch scoped packages in parallel batches (concurrency limit to avoid overwhelming the API)
 // Use Promise.allSettled to not fail on individual errors
 const scopedBatchSize = 20 // Concurrent requests per batch
 for (let i = 0; i < scopedPackages.length; i += scopedBatchSize) {
   const batch = scopedPackages.slice(i, i + scopedBatchSize)
-  bulkPromises.push(
-    (async () => {
-      const results = await Promise.allSettled(
-        batch.map(async name => {
-          const encoded = encodePackageName(name)
-          const data = await $fetch<{ downloads: number }>(
-            `${NPM_API}/downloads/point/last-week/${encoded}`,
-          )
-          return { name, downloads: data.downloads }
-        }),
-      )
-      for (const result of results) {
-        if (result.status === 'fulfilled' && result.value.downloads !== undefined) {
-          downloads.set(result.value.name, result.value.downloads)
-        }
-      }
-    })(),
-  )
+  const results = await Promise.allSettled(
+    batch.map(async name => {
+      const encoded = encodePackageName(name)
+      const data = await $fetch<{ downloads: number }>(
+        `${NPM_API}/downloads/point/last-week/${encoded}`,
+        options,
+      )
+      return { name, downloads: data.downloads }
+    }),
+  )
+  for (const result of results) {
+    if (result.status === 'fulfilled' && result.value.downloads !== undefined) {
+      downloads.set(result.value.name, result.value.downloads)
+    }
+  }
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```suggestion
// Fetch scoped packages in parallel batches (concurrency limit to avoid overwhelming the API)
// Use Promise.allSettled to not fail on individual errors
const scopedBatchSize = 20 // Concurrent requests per batch
for (let i = 0; i < scopedPackages.length; i += scopedBatchSize) {
  const batch = scopedPackages.slice(i, i + scopedBatchSize)
  const results = await Promise.allSettled(
    batch.map(async name => {
      const encoded = encodePackageName(name)
      const data = await $fetch<{ downloads: number }>(
        `${NPM_API}/downloads/point/last-week/${encoded}`,
        options,
      )
      return { name, downloads: data.downloads }
    }),
  )
  for (const result of results) {
    if (result.status === 'fulfilled' && result.value.downloads !== undefined) {
      downloads.set(result.value.name, result.value.downloads)
    }
  }
}
```
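The sequential-batch behaviour this suggestion enforces can be sketched generically. This is a hedged sketch: `runInBatches` and `worker` are hypothetical names, with `worker` standing in for the per-package `$fetch` call.

```typescript
// Process items in chunks of `batchSize`, awaiting each chunk before starting
// the next, so at most `batchSize` calls are in flight at any moment.
async function runInBatches<T, R>(
  items: T[],
  batchSize: number,
  worker: (item: T) => Promise<R>,
): Promise<PromiseSettledResult<R>[]> {
  const results: PromiseSettledResult<R>[] = []
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize)
    // Promise.allSettled keeps one failed item from failing the whole batch
    results.push(...(await Promise.allSettled(batch.map(worker))))
  }
  return results
}
```

The key difference from the original code is that the `await` sits inside the loop, so the next batch cannot start until the previous one settles.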
```ts
filteredVersions[v] = {
  ...((version?.dist as { attestations?: unknown }) ? { hasProvenance: true } : {}),
  version: version.version,
  deprecated: version.deprecated,
  tags: version.tags as string[],
}
```
Incorrect check for provenance - always true when dist exists.

The current check `(version?.dist as { attestations?: unknown })` will be truthy whenever `dist` exists, not when `attestations` exists. This means `hasProvenance` will be set for all versions with a `dist` object.

🐛 Proposed fix

```diff
 filteredVersions[v] = {
-  ...((version?.dist as { attestations?: unknown }) ? { hasProvenance: true } : {}),
+  ...((version?.dist as { attestations?: unknown })?.attestations ? { hasProvenance: true } : {}),
   version: version.version,
   deprecated: version.deprecated,
   tags: version.tags as string[],
 }
```

📝 Committable suggestion
```suggestion
filteredVersions[v] = {
  ...((version?.dist as { attestations?: unknown })?.attestations ? { hasProvenance: true } : {}),
  version: version.version,
  deprecated: version.deprecated,
  tags: version.tags as string[],
}
```
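To see why the original check always passes, here is a minimal illustration. `hasProvenanceBuggy` and `hasProvenanceFixed` are hypothetical helpers written for this sketch, not code from the PR.

```typescript
// A TypeScript `as` cast is purely compile-time: `(dist as { attestations?: unknown })`
// evaluates to `dist` itself, so it is truthy for ANY object, attestations or not.
// The runtime check must look at the property, hence the `?.attestations` in the fix.
function hasProvenanceBuggy(dist: unknown): boolean {
  return Boolean(dist as { attestations?: unknown })
}

function hasProvenanceFixed(dist: unknown): boolean {
  return Boolean((dist as { attestations?: unknown })?.attestations)
}
```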
```ts
  start: string,
  end: string,
): Promise<NpmDownloadsRangeResponse> {
  const encodedName = encodePackageName(packageName)
```
🧩 Analysis chain

🏁 Script executed:

```shell
#!/bin/bash
# Check if encodePackageName is exported from shared/utils/npm.ts
rg -n "export.*encodePackageName" --type=ts
# Check Nuxt configuration for auto-import settings
fd -t f "nuxt.config" --exec cat {}
# Also search for the file location to understand the structure
fd -t f "npm.ts" | head -20
```

Length of output: 7175

🏁 Script executed:

```shell
#!/bin/bash
# Check the actual app/utils/npm/api.ts file
cat -n app/utils/npm/api.ts
# Search for auto-import configuration files
fd -t f "(app\.config|shared-imports)" --type=ts
# Search for any nuxt config files
fd -t f "nuxt\.config" -x echo {}
# Look for any explicit imports of encodePackageName in app directory
rg -n "encodePackageName" app/ --type=ts -B 5
```

Length of output: 8152

🏁 Script executed:

```shell
#!/bin/bash
# Look for shared-imports.d.ts or auto-import declarations
fd -t f "shared-imports"
# Check if there are app.config files
fd -t f "app\.config"
# Look for .nuxt directory or any TypeScript definitions
fd -t f -H "\.d\.ts$" | grep -i "import\|auto" | head -10
# Check if app/utils is configured elsewhere for auto-imports
rg -n "app/utils\|auto.*import" nuxt.config.ts
```

Length of output: 83

🏁 Script executed:

```shell
#!/bin/bash
# Check for app/app.config.ts
cat app/app.config.ts 2>/dev/null || echo "File not found"
# Look for any .d.ts files in app/utils or app directory
fd -t f "\.d\.ts" app/
# Check tsconfig.json for path mappings that might enable auto-imports
fd -t f "tsconfig\.json" -x cat {}
# Search for any imports configuration in nuxt.config
rg -A 20 "imports:" nuxt.config.ts
```

Length of output: 806
Add explicit import for `encodePackageName` or configure auto-import in Nuxt.

The function `encodePackageName` is used on line 22 but not explicitly imported. Currently, the Nuxt auto-import configuration includes only composables directories (`~/composables`, `~/composables/*/*.ts`) and does not cover shared utilities. Either add an explicit import matching the pattern used elsewhere in the codebase (`import { encodePackageName } from '#shared/utils/npm'`), or extend the Nuxt configuration to include shared utilities in the auto-import paths.
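A minimal sketch of the second option, assuming Nuxt 3's `imports.dirs` setting. The directory entries shown are placeholders for this repo's actual layout, and the existing `nuxt.config.ts` contents are not reproduced here:

```ts
// nuxt.config.ts — hypothetical sketch, not the repo's actual config
export default defineNuxtConfig({
  imports: {
    dirs: [
      'composables',
      'composables/*/*.ts',
      // add shared utils so helpers like encodePackageName resolve without explicit imports
      'shared/utils',
    ],
  },
})
```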