feat: Add support for invoking Cloud Function with multipart/form-data protocol #10395
mtrezza wants to merge 8 commits into parse-community:alpha

Conversation
🚀 Thanks for opening this pull request! We appreciate your effort in improving the project. Please let us know once your pull request is ready for review.

Note: Please respond to review comments from AI agents just like you would to comments from a human reviewer. Let the reviewer resolve their own comments, unless they have reviewed and accepted your commit, or agreed with your explanation for why the feedback was incorrect.

Caution: Pull requests must be written using an AI agent with human supervision. Pull requests written entirely by a human will likely be rejected, because of lower code quality, higher review effort and the higher risk of introducing bugs. Please note that AI review comments on this pull request alone do not satisfy this requirement.
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID:
📒 Files selected for processing (1)
🚧 Files skipped from review as they are similar to previous changes (1)
📝 Walkthrough

Adds multipart/form-data support for Cloud Code function invocations: skips JSON parsing for multipart requests, parses multipart bodies with `@fastify/busboy`.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant Express as "Express/Router"
    participant JSONParser as "JSON Parser"
    participant MultipartMW as "Multipart Middleware"
    participant Busboy as "@fastify/busboy"
    participant CloudFunc as "Cloud Function Handler"
    participant Response
    Client->>Express: POST /functions/:name (multipart/form-data)
    Express->>JSONParser: should parse body?
    JSONParser-->>Express: skip (is multipart)
    Express->>MultipartMW: run multipartMiddleware
    MultipartMW->>Busboy: stream request -> parse parts
    Busboy-->>MultipartMW: emit field/file events
    MultipartMW->>MultipartMW: accumulate bytes, enforce maxUploadSize
    MultipartMW->>MultipartMW: construct req.body (text + files as Buffers)
    MultipartMW-->>Express: next()
    Express->>CloudFunc: handleCloudFunction(req)
    CloudFunc->>CloudFunc: parse params (Buffer passthrough)
    CloudFunc-->>Response: result
    Response-->>Client: HTTP response
```
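The flow in the diagram can be illustrated with a dependency-free sketch of the parsing step. The PR itself streams the request through `@fastify/busboy`; this buffer-based `parseMultipart` helper is hypothetical (it assumes well-formed parts and buffers the whole body) and only shows the resulting `req.body` shape: text fields as strings, file fields as `{ filename, contentType, data: Buffer }`.

```javascript
// Minimal sketch of the multipart parsing step, not the PR's implementation.
// Splits the raw body on the boundary delimiter and builds the same field
// shapes that the middleware merges into request.params.
function parseMultipart(body, boundary) {
  const delimiter = Buffer.from(`--${boundary}`);
  const fields = {};
  const parts = [];
  let start = body.indexOf(delimiter);
  while (start !== -1) {
    const next = body.indexOf(delimiter, start + delimiter.length);
    if (next === -1) {
      break; // the closing "--boundary--" marker ends the loop
    }
    parts.push(body.slice(start + delimiter.length, next));
    start = next;
  }
  for (let part of parts) {
    // Strip the CRLF after the boundary and the CRLF before the next one.
    part = part.slice(2, part.length - 2);
    const headerEnd = part.indexOf('\r\n\r\n');
    const headers = part.slice(0, headerEnd).toString('utf8');
    const data = part.slice(headerEnd + 4);
    const name = /name="([^"]+)"/.exec(headers)[1];
    const filenameMatch = /filename="([^"]*)"/.exec(headers);
    if (filenameMatch) {
      const ctMatch = /content-type:\s*([^\r\n]+)/i.exec(headers);
      fields[name] = {
        filename: filenameMatch[1],
        contentType: ctMatch ? ctMatch[1].trim() : 'application/octet-stream',
        data,
      };
    } else {
      fields[name] = data.toString('utf8');
    }
  }
  return fields;
}
```

Unlike this sketch, the streaming approach in the PR never holds more than the configured `maxUploadSize` in memory, which is why busboy's event API is used instead.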
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs
🚥 Pre-merge checks | ✅ 3 | ❌ 2
❌ Failed checks (1 warning, 1 inconclusive)
✅ Passed checks (3 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
✨ Finishing Touches
🧪 Generate unit tests (beta)
🧹 Nitpick comments (4)
package.json (1)
Lines 1-173: Consider adding scope to PR title for better changelog clarity. The current PR title follows Angular commit convention. To make it more specific for developers scanning the changelog, consider adding the `functions` scope:

feat(functions): Add multipart/form-data support for Cloud Function invocation

This makes it immediately clear which subsystem is affected. Based on learnings, suggesting PR titles that make meaningful changelog entries for developers.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@package.json` around lines 1-173: Update the PR title to include the "functions" scope so the changelog is clearer; change the current title to "feat(functions): Add multipart/form-data support for Cloud Function invocation" and ensure any related commit messages or squash commit used for the merge also use this scoped title.

src/Routers/FunctionsRouter.js (3)
Lines 230-235: Add braces to single-line `if` statement per linting rules.

🔧 Proposed fix

```diff
 busboy.on('finish', () => {
-  if (settled) return;
+  if (settled) {
+    return;
+  }
   settled = true;
   req.body = fields;
   resolve();
 });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/Routers/FunctionsRouter.js` around lines 230 - 235, The single-line conditional inside the busboy 'finish' handler uses an unbraced if (settled) return; which violates linting rules; update the handler in the busboy.on('finish', ...) callback (the code referencing settled, req.body = fields, and resolve()) to use a braced conditional (e.g., if (settled) { return; }) so the early-return is explicitly scoped and consistent with lint rules.
Lines 221-228: Add braces to single-line `if` statement per linting rules.

🔧 Proposed fix

```diff
 stream.on('end', () => {
-  if (settled) return;
+  if (settled) {
+    return;
+  }
   fields[name] = {
     filename,
     contentType: mimeType || 'application/octet-stream',
     data: Buffer.concat(chunks),
   };
 });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/Routers/FunctionsRouter.js` around lines 221 - 228, The single-line if inside the stream.on('end', ...) callback violates lint rules; update the callback so the conditional uses braces: replace "if (settled) return;" with a block-form conditional (e.g., "if (settled) { return; }") within the stream.on('end' ...) handler where variables like settled, fields, name, filename, mimeType, chunks are used to populate fields[name].
Lines 188-193: Add braces to single-line `if` statement per linting rules. The static analysis tool flagged the missing braces on the early return.

🔧 Proposed fix

```diff
 const safeReject = (err) => {
-  if (settled) return;
+  if (settled) {
+    return;
+  }
   settled = true;
   busboy.destroy();
   reject(err);
 };
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/Routers/FunctionsRouter.js` around lines 188 - 193, The single-line early-return inside the safeReject function lacks braces (if (settled) return;) which violates lint rules; update safeReject to use a braced if block (if (settled) { return; }) so the guard is clear and consistent, keeping the rest of the function (setting settled = true, busboy.destroy(), reject(err)) unchanged.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 770506d5-a803-4582-9641-a4ff9aee8506
📒 Files selected for processing (5)
- package-lock.json
- package.json
- spec/CloudCodeMultipart.spec.js
- src/ParseServer.ts
- src/Routers/FunctionsRouter.js
Codecov Report

❌ Patch coverage is

Additional details and impacted files

```
@@ Coverage Diff @@
##            alpha   #10395      +/-   ##
==========================================
- Coverage   92.52%   92.51%   -0.01%
==========================================
  Files         192      192
  Lines       16663    16709      +46
  Branches      229      229
==========================================
+ Hits        15417    15459      +42
- Misses       1224     1228       +4
  Partials       22       22
```

☔ View full report in Codecov by Sentry.
Actionable comments posted: 2
🧹 Nitpick comments (1)
src/Routers/FunctionsRouter.js (1)
Line 186: Normalize invalid `maxUploadSize` values here. Line 186 is the first place the functions path parses `req.config.maxUploadSize`, so a bad config bubbles out as the raw `parseSizeToBytes` failure. Consider validating this once at startup or mirroring the guard already used on the file-upload path so multipart calls fail with a clearer error.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/Routers/FunctionsRouter.js` at line 186, The code currently calls Utils.parseSizeToBytes(req.config.maxUploadSize) directly in FunctionsRouter (producing const maxBytes) which lets bad config values throw raw errors; update this to validate/normalize req.config.maxUploadSize before parsing (or catch parse errors) and mirror the existing file-upload guard behavior: if parse fails or yields a non-positive value, return a clear HTTP error (or fall back to a sensible default) with a descriptive message referencing req.config.maxUploadSize, and ensure Utils.parseSizeToBytes usage around maxBytes is wrapped in a try/validation branch so multipart function uploads fail with that clearer error instead of bubbling the raw exception.
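The suggested guard could look like the sketch below. `parseSizeToBytesSafe` and `DEFAULT_MAX_UPLOAD_SIZE` are illustrative names, not Parse Server API; the 20 MB fallback mirrors Parse Server's documented `maxUploadSize` default.

```javascript
// Hypothetical sketch: normalize maxUploadSize once, instead of letting a
// size-parsing failure throw a raw error in the middle of a multipart request.
const DEFAULT_MAX_UPLOAD_SIZE = 20 * 1024 * 1024; // Parse Server's '20mb' default

function parseSizeToBytesSafe(size) {
  // Accept values like '512b', '100kb', '20mb', '1gb'; bare numbers mean bytes.
  const match = /^(\d+(?:\.\d+)?)\s*(b|kb|mb|gb)?$/i.exec(String(size).trim());
  if (!match) {
    return DEFAULT_MAX_UPLOAD_SIZE; // unparseable config falls back to the default
  }
  const units = { b: 1, kb: 1024, mb: 1024 ** 2, gb: 1024 ** 3 };
  const bytes = parseFloat(match[1]) * units[(match[2] || 'b').toLowerCase()];
  return bytes > 0 ? bytes : DEFAULT_MAX_UPLOAD_SIZE; // reject non-positive limits
}
```

An alternative, as the comment notes, is to validate the option once at server startup and fail fast with a descriptive configuration error.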
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/Routers/FunctionsRouter.js`:
- Around line 193-218: The Busboy instance is not configured to respect Parse
Server's maxUploadSize and the field handler doesn't check truncation flags, so
text fields can be silently truncated; when creating Busboy (where you call
Busboy({ headers: req.headers })) pass limits that wire fieldSize (and
optionally fieldNameSize) to the same maxUploadSize value, and inside the
busboy.on('field', (name, value, info) => ...) handler check
info.fieldnameTruncated or info.valueTruncated (or valTruncated) and call
safeReject with a Parse.Error(Parse.Error.OBJECT_TOO_LARGE, 'Multipart request
exceeds maximum upload size.') if truncation occurred, preserving the existing
totalBytes check and use of safeReject/settled.
- Around line 34-35: parseParams currently returns Buffer objects verbatim
(Buffer.isBuffer(obj) branch), which lets JSON.stringify serialize full byte
arrays before truncateLogMessage runs; update parseParams in FunctionsRouter.js
to detect Buffer instances (and any nested Buffers) and replace them with a
safe/redacted placeholder (e.g., "[REDACTED_BUFFER]" or an object with only the
size/metadata) so that when params are later JSON.stringify'd (see where params
is logged around the previous params logging calls) large uploads from the
multipart path (the { filename, contentType, data: Buffer } objects handled
around the parseParams call) do not cause full byte-array serialization; ensure
the Buffer redaction is applied recursively in parseParams so nested Buffers are
also replaced.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 588dd585-5d6e-499d-9d2e-15ae7440727a
📒 Files selected for processing (5)
- package-lock.json
- package.json
- spec/CloudCodeMultipart.spec.js
- src/Routers/FunctionsRouter.js
- src/cloud-code/Parse.Cloud.js
✅ Files skipped from review due to trivial changes (3)
- package-lock.json
- package.json
- src/cloud-code/Parse.Cloud.js
```js
} else if (Buffer.isBuffer(obj)) {
  return obj;
```
🧩 Analysis chain

🏁 Scripts executed (repository: parse-community/parse-server):

```shell
wc -l src/Routers/FunctionsRouter.js
cat -n src/Routers/FunctionsRouter.js | sed -n '1,50p'
cat -n src/Routers/FunctionsRouter.js | sed -n '230,250p'
cat -n src/Routers/FunctionsRouter.js | sed -n '285,320p'
cat -n src/Routers/FunctionsRouter.js | sed -n '250,295p'
grep -n "parseParams\|params" src/Routers/FunctionsRouter.js | head -30
git log --oneline -n 20 -- src/Routers/FunctionsRouter.js
rg "truncateLogMessage" -A 5 src/
git show HEAD:src/Routers/FunctionsRouter.js | sed -n '34,35p'
git log --all -p -- src/Routers/FunctionsRouter.js | grep -B 5 -A 5 "Buffer.isBuffer" | head -30
git show HEAD~1:src/Routers/FunctionsRouter.js 2>/dev/null | sed -n '238,242p' || echo "File may not have existed"
cat -n src/Controllers/LoggerController.js | sed -n '1,50p'
cat -n src/Controllers/LoggerController.js | sed -n '50,120p'
git log -1 --oneline -- src/Routers/FunctionsRouter.js
git diff HEAD~1 HEAD -- src/Routers/FunctionsRouter.js | head -100
git log --oneline --all -- src/Routers/FunctionsRouter.js
node -e "const buf = Buffer.from('hello'); console.log(JSON.stringify({data: buf}))"
node -e "const buf = Buffer.from('hello world this is a test buffer with some content'); console.log('Length of stringified:', JSON.stringify({data: buf}).length); console.log('Full stringify:', JSON.stringify({data: buf}))"
```
Redact Buffer objects from cloud function logging to prevent full byte array serialization.
When multipart file uploads create { filename, contentType, data: Buffer } objects (lines 238-242), these flow through parseParams (which preserves Buffers at lines 34-35) into the params logged at lines 292 and 311. The JSON.stringify(params) calls serialize the entire Buffer byte array before truncateLogMessage is applied, causing unnecessary memory allocation. Node's Buffer.toJSON() emits all bytes as an array, so large file uploads can trigger megabytes of unwanted serialization even though truncation prevents logging output.
Filter out or redact Buffer objects before JSON.stringify.
Also applies to: 238-242
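A minimal sketch of the suggested redaction, assuming a recursive walk over the params object before it is serialized for logging (`redactBuffers` is an illustrative helper name, not Parse Server API):

```javascript
// Recursively replace Buffers with a small metadata placeholder so that
// JSON.stringify(params) never serializes full byte arrays into log strings.
function redactBuffers(value) {
  if (Buffer.isBuffer(value)) {
    // Keep only the size; the bytes themselves never reach the logger.
    return `[Buffer ${value.length} bytes]`;
  }
  if (Array.isArray(value)) {
    return value.map(redactBuffers);
  }
  if (value && typeof value === 'object') {
    const out = {};
    for (const [key, inner] of Object.entries(value)) {
      out[key] = redactBuffers(inner); // handles nested Buffers too
    }
    return out;
  }
  return value;
}
```

The Buffer check must come before the generic object branch, since Buffers are objects and would otherwise be walked key by key.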
```js
  busboy = Busboy({ headers: req.headers });
} catch (err) {
  return reject(
    new Parse.Error(Parse.Error.INVALID_JSON, `Invalid multipart request: ${err.message}`)
  );
}
const safeReject = (err) => {
  if (settled) {
    return;
  }
  settled = true;
  busboy.destroy();
  reject(err);
};
busboy.on('field', (name, value) => {
  totalBytes += Buffer.byteLength(value);
  if (totalBytes > maxBytes) {
    return safeReject(
      new Parse.Error(
        Parse.Error.OBJECT_TOO_LARGE,
        'Multipart request exceeds maximum upload size.'
      )
    );
  }
  fields[name] = value;
});
```
🧩 Analysis chain

🏁 Scripts executed (repository: parse-community/parse-server):

```shell
# First, let's examine the actual file to understand the code context
cd /tmp && git ls-files | grep -E "(FunctionsRouter|package\.json)" | head -20
# Check the package.json to see the busboy version
cat package.json | grep -A2 -B2 busboy
# Examine the FunctionsRouter.js file around the mentioned lines
wc -l src/Routers/FunctionsRouter.js
# Read the relevant section of the file
sed -n '180,230p' src/Routers/FunctionsRouter.js
# Check if there are other field event handlers in the file
rg "\.on\(['\"](field|file)" src/Routers/FunctionsRouter.js -A 2
# Also search for any truncated checks in the entire file
rg "truncated|Truncated" src/Routers/FunctionsRouter.js
```

🌐 Web query: @fastify/busboy default fieldSize fieldNameSize parameters

💡 Result: The @fastify/busboy package, a Fastify-optimized fork of busboy for parsing HTML form data, has the following default values for the specified limit parameters in its limits configuration object: fieldSize: 1 MiB (1,048,576 bytes, or 1024 x 1024 bytes); fieldNameSize: 100 bytes. These defaults apply when no custom limits are provided to the Busboy constructor. They limit the maximum size of non-file field values and field names, respectively, to prevent excessive memory usage or denial-of-service attacks.

🌐 Web query: @fastify/busboy field callback signature truncation flags parameters

💡 Result: The 'field' event callback in @fastify/busboy has the signature (fieldname, value, fieldnameTruncated, valTruncated, transferEncoding, mimeType). fieldname: string - the name of the field. value: string - the field value. fieldnameTruncated: boolean - true if the field name exceeded the configured fieldNameSize limit. valTruncated: boolean - true if the field value exceeded the configured fieldSize limit. transferEncoding: string - the Content-Transfer-Encoding value (e.g., 'binary', 'quoted-printable'). mimeType: string - the Content-Type value of the field. These truncation flags indicate when limits are hit due to configuration options like fieldNameSize and fieldSize in the Busboy constructor.

🌐 Web query: busboy field event callback truncated valueTruncated keyTruncated

💡 Result: In the busboy Node.js library for parsing multipart/form-data, the 'field' event is emitted for each non-file form field. The callback receives fieldname (string), value (string), and info (object). The info object contains: nameTruncated (boolean, true if the field name was truncated due to limits.fieldNameSize), valueTruncated (boolean, true if the field value was truncated due to limits.fieldSize), encoding (string, Content-Transfer-Encoding), and mimeType (string, Content-Type). These truncation flags allow detection of oversized field names/values per configured limits. Example usage: `bb.on('field', (name, val, info) => { if (info.valueTruncated) { /* ... */ } });`

Citations:
- 1: http://github.com/mscdex/busboy
- 2: https://www.npmjs.com/package/busboy
- 3: FEATURE REQUEST: Emit event when `limits.fileSize` or `limits.fieldSize` is reached (mscdex/busboy#307)
Wire Busboy's field limits to maxUploadSize.
@fastify/busboy defaults non-file fields to a 1 MiB fieldSize and 100-byte fieldNameSize limits. When field values exceed these limits, the library truncates them and signals truncation through the fieldnameTruncated and valTruncated flags passed to the field callback. Since this code doesn't configure these limits to match Parse Server's maxUploadSize and doesn't check the truncation flags, multipart text fields can be silently accepted with truncated values while the overall request still passes the size limit check.
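The suggested change could be sketched as below, under the assumption that @fastify/busboy's 'field' callback passes positional truncation flags as described above. `makeFieldHandler` is an illustrative factory (using a plain `Error` to stay self-contained); in the PR the equivalent logic would sit inline in `busboy.on('field', ...)` and throw a `Parse.Error`.

```javascript
// Reject both on the running byte total and on busboy's truncation flags,
// so silently shortened field values can never reach request.params.
function makeFieldHandler(maxBytes, fields, safeReject) {
  let totalBytes = 0;
  // @fastify/busboy field callback shape:
  // (name, value, fieldnameTruncated, valTruncated, transferEncoding, mimeType)
  return (name, value, fieldnameTruncated, valTruncated) => {
    totalBytes += Buffer.byteLength(value);
    if (fieldnameTruncated || valTruncated || totalBytes > maxBytes) {
      return safeReject(new Error('Multipart request exceeds maximum upload size.'));
    }
    fields[name] = value;
  };
}

// The constructor would wire the same limit through busboy's own limits:
// busboy = Busboy({ headers: req.headers, limits: { fieldSize: maxBytes } });
```

Wiring `fieldSize` to `maxBytes` makes busboy's internal truncation threshold agree with Parse Server's limit, and the flag check catches the case where truncation happened anyway.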
Pull Request
Issue
Cloud Code Functions only accept JSON payloads. Developers who need to send files to a Cloud Function must first upload the file separately, then call the function referencing the uploaded file.
Approach
Add `multipart/form-data` support to `POST /functions/:functionName`. A route-level middleware on `FunctionsRouter` detects the content type and parses the body with `@fastify/busboy`. Text fields arrive as strings, file fields arrive as `{ filename, contentType, data: Buffer }`. All fields merge flat into `request.params`.

Tasks
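Assuming the feature behaves as described, a client call might be constructed like this. The server URL, app id, and the function name `processFile` are placeholders, not values from the PR.

```javascript
// Hypothetical client-side call shape for the new multipart endpoint.
const form = new FormData(); // global in Node 18+ and in browsers
form.append('caption', 'Hello'); // arrives in request.params.caption as a string
form.append(
  'file',
  new Blob([Buffer.from('fake image bytes')], { type: 'image/jpeg' }),
  'photo.jpg'
); // arrives in request.params.file as { filename, contentType, data: Buffer }

// fetch('https://example.com/parse/functions/processFile', {
//   method: 'POST',
//   headers: { 'X-Parse-Application-Id': 'myAppId' },
//   body: form, // fetch sets the multipart boundary header automatically
// });
```

Because the middleware merges fields flat into `request.params`, the Cloud Function reads `request.params.caption` and `request.params.file` without knowing the request was multipart.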
Summary by CodeRabbit
- New Features
- Tests
- Chores
- Documentation
Documentation