feat(frontend): frontend docker improving #175
Conversation
Walkthrough

This pull request introduces a new Dockerfile for a Node.js application and updates container management by introducing base image verification and conditional building. The changes also implement enhanced service readiness checks in the frontend, adding periodic retries and dynamic UI feedback. These updates affect the Docker configuration, the API route handling container creation, and the web-view component responsible for loading previews.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Dev as Developer
    participant App as runDockerContainer Function
    participant Checker as ensureBaseImageExists
    participant Docker as Docker Engine
    Dev->>App: Trigger container run
    App->>Checker: Check if base image exists
    Checker->>Docker: Execute docker command to check image
    Docker-->>Checker: Return image existence status
    alt Image not found
        Checker->>Docker: Build base image
        Docker-->>Checker: Confirm image built
    end
    App->>Docker: Run container with resource limits using base image
    Docker-->>App: Container started
```

```mermaid
sequenceDiagram
    participant User as User
    participant UI as PreviewContent Component
    participant Checker as Service Checker
    participant Service as Frontend Service
    User->>UI: Request preview view
    UI->>Checker: Start service readiness check
    Checker->>Service: Send HEAD request for readiness
    Service-->>Checker: Responds (ready/not ready)
    alt Service not ready
        Checker->>UI: Update loading state & schedule retry
        UI->>Checker: Retry check (up to 15 attempts)
    else Service ready
        Checker->>UI: Set service as ready
        UI->>User: Display preview via iframe
    end
```
Warning

There were issues while running some tools. Please review the errors and either fix the tool's configuration or disable the tool if it's a critical failure.

🔧 ESLint

- frontend/src/app/api/runProject/route.ts: ESLint 8.57.1 couldn't find the config "next/core-web-vitals" to extend from. Please check that the name of the config is correct; it is referenced from the config file in "/frontend/.eslintrc.json".
- frontend/src/components/chat/code-engine/web-view.tsx: same error; the config "next/core-web-vitals" referenced from "/frontend/.eslintrc.json" could not be resolved.

If you still have problems, please stop by https://eslint.org/chat/help to chat with the team.
Actionable comments posted: 5
🧹 Nitpick comments (2)
docker/project-base-image/Dockerfile (1)

17-19: Confirm development vs. production usage.

Currently, the CMD installs dev dependencies on container start, which is more typical for development than production. If you intend to use this image in production, consider installing dependencies at build time for faster container startup and smaller image sizes.

frontend/src/app/api/runProject/route.ts (1)

449-449: Optional: Enhance error context.

Including the project path or container name in the error log can ease troubleshooting.

-console.error(`Error running container:`, error);
+console.error(`Error running container for project ${projectPath}:`, error);
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- docker/project-base-image/Dockerfile (1 hunks)
- frontend/src/app/api/runProject/route.ts (12 hunks)
- frontend/src/components/chat/code-engine/web-view.tsx (14 hunks)
🔇 Additional comments (11)

frontend/src/app/api/runProject/route.ts (5)

13-14: Looks good!

Defining a single constant for the base image name is a neat approach for maintainability.

198-211: Fine approach to checking base image presence.

checkBaseImageExists is straightforward and properly checks for errors. Ensure that downstream calls handle any transient Docker issues or race conditions.
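For context, a minimal sketch of what such an existence check can look like, assuming Node's child_process and a docker CLI on PATH (illustrative only, not the PR's exact code):

```typescript
import { exec } from 'child_process';
import { promisify } from 'util';

const execAsync = promisify(exec);
const BASE_IMAGE_NAME = 'project-base-image'; // assumed constant name

// Returns true if the base image is present in the local Docker daemon.
async function checkBaseImageExists(): Promise<boolean> {
  try {
    // `docker image inspect` exits non-zero when the image is missing,
    // so execAsync rejects and we report "not found".
    await execAsync(`docker image inspect ${BASE_IMAGE_NAME}`);
    return true;
  } catch {
    return false;
  }
}
```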
313-315: Renamed function improves clarity.

runDockerContainer is more descriptive than buildAndRunDocker since the base image is handled separately.

355-356: Ensuring base image existence is essential.

Good call to block container creation until the base image is available.

389-390: Resource limits look appropriate.

Imposing memory and CPU constraints can protect the host from runaway processes.

Also applies to: 404-405
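For illustration, a hedged sketch of the kind of run command these limits imply, reusing the execAsync helper and BASE_IMAGE_NAME constant from the sketch above (flag values and parameters are placeholders, not necessarily those used in the PR):

```typescript
// Hypothetical docker run invocation with memory and CPU caps.
async function runDockerContainer(containerName: string, projectPath: string, port: number): Promise<void> {
  const cmd = [
    'docker run -d',
    `--name ${containerName}`,
    '--memory=512m',          // cap RAM so a runaway build cannot exhaust the host
    '--cpus=1',               // limit CPU share per preview container
    `-p ${port}:5173`,        // assumed Vite dev-server port
    `-v ${projectPath}:/app`, // mount the generated project into the container
    BASE_IMAGE_NAME,
  ].join(' ');
  await execAsync(cmd);
}
```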
frontend/src/components/chat/code-engine/web-view.tsx (6)

28-31: Good use of separate state variables.

These states (isServiceReady, serviceCheckAttempts, etc.) make the readiness logic more transparent.

40-70: Service readiness check is concise.

The timeout + HEAD request strategy is effective. Consider differentiating between different error statuses if needed.
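checkServiceReady itself is not shown in this excerpt; a plausible shape for a timeout-bounded HEAD probe is sketched below (assumptions: a 2-second budget and a same-origin or CORS-permitted preview URL):

```typescript
// Probe the dev server with a HEAD request, bounded by a short timeout.
async function checkServiceReady(url: string): Promise<boolean> {
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), 2000); // assumed 2s budget
  try {
    const res = await fetch(url, { method: 'HEAD', signal: controller.signal });
    return res.ok;
  } catch {
    return false; // network error, abort, or server not up yet
  } finally {
    clearTimeout(timeout);
  }
}
```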
178-180: Loading flow improvements look correct.

Resetting readiness state on project change and setting user-friendly error messages makes the preview more robust.

Also applies to: 182-185, 202-205

213-217: Conditional iframe loading & reload logic are well-handled.

Only assigning src if the service is ready prevents broken iframes. Reloading resets checks properly.

Also applies to: 258-262
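The endorsed pattern is roughly the following (JSX names such as iframeRef and previewUrl are illustrative):

```tsx
// Only point the iframe at the preview once the service responds;
// passing undefined omits src, so the browser never renders a connection error.
<iframe
  ref={iframeRef}
  src={isServiceReady && previewUrl ? previewUrl : undefined}
  className="h-full w-full border-0"
  title="Project preview"
/>
```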
294-295: Disabling controls while the service is not ready is good UX.

Preventing navigation and zoom until the service is available avoids confusing errors.

Also applies to: 303-305, 314-315, 328-329, 339-340, 348-349, 357-358, 366-367

375-411: Dynamic loading UI is well-structured.

The spinner, loading message, and retry button provide clear feedback. Just ensure timers are fully cleaned up on unmount.
# Pre-install common frontend dependencies to speed up project startup
RUN npm install -g npm@latest vite@latest
🛠️ Refactor suggestion
Use pinned versions for npm and Vite.

Installing npm@latest and vite@latest globally can lead to unpredictable builds if these packages publish breaking updates. Consider pinning specific versions or referencing a lockfile for more consistent and reproducible builds.

-RUN npm install -g npm@latest vite@latest
+RUN npm install -g npm@9.8.1 vite@4.3.9
// Flag to track if base image has been built
let baseImageBuilt = false;
Address potential concurrency issue with baseImageBuilt.

Because baseImageBuilt is a simple boolean, multiple requests arriving simultaneously may each try to build the image before it's set to true. This can cause redundant builds or race conditions.
Also applies to: 91-92
/**
 * Remove node_modules and lock files
 * Build base image if it doesn't exist
🛠️ Refactor suggestion
Serialize base image builds to avoid race conditions.
Multiple concurrent calls to ensureBaseImageExists could block or conflict. One approach is to maintain a building "lock" or queue:
-let baseImageBuilt = false;
+let baseImageBuilding = false;
+let baseImagePromise: Promise<void> | null = null;
async function ensureBaseImageExists(): Promise<void> {
if (baseImageBuilt) return;
+ if (baseImageBuilding && baseImagePromise) {
+ // If a build is already in progress, wait for that promise
+ await baseImagePromise;
+ return;
+ }
+ baseImageBuilding = true;
+ baseImagePromise = (async () => {
try {
...
baseImageBuilt = true;
} finally {
+ baseImageBuilding = false;
+ }
+ })();
+ await baseImagePromise;
}
Also applies to: 238-272
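Put together, the suggested pattern amounts to caching a single in-flight build promise. A self-contained sketch (it reuses the illustrative checkBaseImageExists, execAsync, and BASE_IMAGE_NAME from the earlier sketches; this is not the PR's exact code):

```typescript
let baseImagePromise: Promise<void> | null = null;

// All callers share one build: the first caller kicks it off,
// later callers await the same promise instead of re-building.
async function ensureBaseImageExists(): Promise<void> {
  if (!baseImagePromise) {
    baseImagePromise = (async () => {
      if (await checkBaseImageExists()) return;
      await execAsync(
        `docker build -t ${BASE_IMAGE_NAME} docker/project-base-image`
      );
    })().catch((err) => {
      baseImagePromise = null; // allow a retry after a failed build
      throw err;
    });
  }
  return baseImagePromise;
}
```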
const serviceCheckTimerRef = useRef<NodeJS.Timeout | null>(null);
const MAX_CHECK_ATTEMPTS = 15; // Reduced max attempts since we have progressive intervals
🛠️ Refactor suggestion
Watch for fallback timer cleanup.

serviceCheckTimerRef may not clear all timers, such as the 45-second fallback timer. This could lead to a memory leak if the user navigates away before it's triggered.
// Function to periodically check service readiness
const startServiceReadyCheck = async (url: string) => {
  // Clear any existing timer
  if (serviceCheckTimerRef.current) {
    clearInterval(serviceCheckTimerRef.current);
  }

  setServiceCheckAttempts(0);
  setIsServiceReady(false);
  setLoadingMessage('Loading preview...');

  // Try immediately first (don't wait for interval)
  const initialReady = await checkServiceReady(url);
  if (initialReady) {
    console.log('Frontend service is ready immediately!');
    setIsServiceReady(true);
    return; // Exit early if service is ready immediately
  }

  // Progressive check intervals (check more frequently at first)
  const checkIntervals = [500, 1000, 1000, 1500, 1500]; // First few checks are faster
  let checkIndex = 0;

  // Set a fallback timer - show preview after 45 seconds no matter what
  const fallbackTimer = setTimeout(() => {
    console.log('Fallback timer triggered - showing preview anyway');
    setIsServiceReady(true);
    if (serviceCheckTimerRef.current) {
      clearInterval(serviceCheckTimerRef.current);
      serviceCheckTimerRef.current = null;
    }
  }, 45000);

  const runServiceCheck = async () => {
    setServiceCheckAttempts((prev) => prev + 1);

    // Update loading message with attempts
    if (serviceCheckAttempts > 3) {
      setLoadingMessage(
        `Starting frontend service... (${serviceCheckAttempts}/${MAX_CHECK_ATTEMPTS})`
      );
    }

    const ready = await checkServiceReady(url);

    if (ready) {
      console.log('Frontend service is ready!');
      setIsServiceReady(true);
      clearTimeout(fallbackTimer);
      if (serviceCheckTimerRef.current) {
        clearInterval(serviceCheckTimerRef.current);
        serviceCheckTimerRef.current = null;
      }
    } else if (serviceCheckAttempts >= MAX_CHECK_ATTEMPTS) {
      // Service didn't become ready after max attempts
      console.log(
        'Max attempts reached. Service might still be initializing.'
      );
      setLoadingMessage(
        'Preview might not be fully loaded. Click refresh to try again.'
      );

      // Show the preview anyway after max attempts
      setIsServiceReady(true);
      clearTimeout(fallbackTimer);

      if (serviceCheckTimerRef.current) {
        clearInterval(serviceCheckTimerRef.current);
        serviceCheckTimerRef.current = null;
      }
    } else {
      // Schedule next check with dynamic interval
      const nextInterval =
        checkIndex < checkIntervals.length
          ? checkIntervals[checkIndex++]
          : 2000; // Default to 2000ms after initial fast checks

      setTimeout(runServiceCheck, nextInterval);
    }
  };

  // Start the first check
  setTimeout(runServiceCheck, 500);
};
Possible fallback timer leak.
In startServiceReadyCheck, a 45-second fallback timer is created (line 96) but not stored in serviceCheckTimerRef, meaning it won't be cleared by the component's cleanup effect.
- const fallbackTimer = setTimeout(() => {
+ serviceCheckTimerRef.current = setTimeout(() => {
console.log('Fallback timer triggered - showing preview anyway');
setIsServiceReady(true);
...
}, 45000);
Committable suggestion skipped: line range outside the PR's diff.
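Beyond storing the fallback timer, a fuller fix is to clear both timers in the component's unmount cleanup. A sketch, assuming a hypothetical fallbackTimerRef alongside the existing serviceCheckTimerRef (both live inside the PreviewContent component):

```typescript
const fallbackTimerRef = useRef<NodeJS.Timeout | null>(null);

useEffect(() => {
  return () => {
    // Clear the retry timer and the 45-second fallback on unmount.
    if (serviceCheckTimerRef.current) {
      clearTimeout(serviceCheckTimerRef.current);
      serviceCheckTimerRef.current = null;
    }
    if (fallbackTimerRef.current) {
      clearTimeout(fallbackTimerRef.current);
      fallbackTimerRef.current = null;
    }
  };
}, []);
```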
LGTM