node-fetch
A light-weight module that brings the Fetch API to Node.js.
Consider supporting us on our Open Collective.
You might be looking for the v2 docs.
- Motivation
- Features
- Difference from client-side fetch
- Installation
- Loading and configuring the module
- Upgrading
- Common Usage
- Plain text or HTML
- JSON
- Simple Post
- Post with JSON
- Post with form parameters
- Handling exceptions
- Handling client and server errors
- Handling cookies
- Advanced Usage
- Streams
- Accessing Headers and other Metadata
- Extract Set-Cookie Header
- Post data using a file
- Request cancellation with AbortSignal
- API
- fetch(url[, options])
- Options
- Default Headers
- Custom Agent
- Custom highWaterMark
- Insecure HTTP Parser
- Class: Request
- new Request(input[, options])
- Class: Response
- new Response([body[, options]])
- response.ok
- response.redirected
- response.type
- Class: Headers
- new Headers([init])
- Interface: Body
- body.body
- body.bodyUsed
- body.arrayBuffer()
- body.blob()
- body.formData()
- body.json()
- body.text()
- Class: FetchError
- Class: AbortError
- TypeScript
- Acknowledgement
- Team
- Former
- License
Motivation
Instead of implementing XMLHttpRequest in Node.js to run a browser-specific Fetch polyfill, why not go from native http to the fetch API directly? Hence, node-fetch: minimal code for a window.fetch-compatible API on the Node.js runtime.
See Jason Miller's isomorphic-unfetch or Leonardo Quixada's cross-fetch for isomorphic usage (exports node-fetch for server-side, whatwg-fetch for client-side).
Features
- Stay consistent with the window.fetch API.
- Make conscious trade-offs when following the WHATWG fetch spec and stream spec implementation details; document known differences.
- Use native promises and async functions.
- Use native Node streams for the body, on both request and response.
- Decode content encoding (gzip/deflate/brotli) properly, and convert string output (such as res.text() and res.json()) to UTF-8 automatically.
- Useful extensions such as redirect limit, response size limit, and explicit errors for troubleshooting.
Difference from client-side fetch
- See known differences:
- As of v3.x
- As of v2.x
- If you happen to use a missing feature that window.fetch offers, feel free to open an issue.
- Pull requests are welcome too!
Installation
The current stable release (3.x) requires at least Node.js 12.20.0.
Loading and configuring the module
ES Modules (ESM)

```js
import fetch from 'node-fetch';
```

CommonJS
node-fetch from v3 is an ESM-only module; you are not able to import it with require().

If you cannot switch to ESM, please use v2, which remains compatible with CommonJS. Critical bug fixes will continue to be published for v2.
Alternatively, you can use the async import() function from CommonJS to load node-fetch asynchronously:

```js
// mod.cjs
const fetch = (...args) => import('node-fetch').then(({default: fetch}) => fetch(...args));
```
Providing global access
To use fetch() without importing it, you can patch the global object in node:

```js
// fetch-polyfill.js
import fetch, {
  Blob,
  blobFrom,
  blobFromSync,
  File,
  fileFrom,
  fileFromSync,
  FormData,
  Headers,
  Request,
  Response,
} from 'node-fetch'

if (!globalThis.fetch) {
  globalThis.fetch = fetch
  globalThis.Headers = Headers
  globalThis.Request = Request
  globalThis.Response = Response
}

// index.js
import './fetch-polyfill'

// ...
```
Upgrading
Using an old version of node-fetch? Check out the following files:
- 2.x to 3.x upgrade guide
- 1.x to 2.x upgrade guide
- Changelog
Common Usage
NOTE: The documentation below is up-to-date with 3.x releases; if you are using an older version, please check how to upgrade.
Plain text or HTML

```js
import fetch from 'node-fetch';

const response = await fetch('https://github.com/');
const body = await response.text();

console.log(body);
```
JSON

```js
import fetch from 'node-fetch';

const response = await fetch('https://api.github.com/users/github');
const data = await response.json();

console.log(data);
```
Simple Post

```js
import fetch from 'node-fetch';

const response = await fetch('https://httpbin.org/post', {method: 'POST', body: 'a=1'});
const data = await response.json();

console.log(data);
```
Post with JSON

```js
import fetch from 'node-fetch';

const body = {a: 1};

const response = await fetch('https://httpbin.org/post', {
  method: 'post',
  body: JSON.stringify(body),
  headers: {'Content-Type': 'application/json'}
});
const data = await response.json();

console.log(data);
```
Post with form parameters
URLSearchParams is available on the global object in Node.js as of v10.0.0. See the official documentation for more usage methods.

NOTE: The Content-Type header is only set automatically to x-www-form-urlencoded when an instance of URLSearchParams is given as such:

```js
import fetch from 'node-fetch';

const params = new URLSearchParams();
params.append('a', 1);

const response = await fetch('https://httpbin.org/post', {method: 'POST', body: params});
const data = await response.json();

console.log(data);
```
Handling exceptions
NOTE: 3xx-5xx responses are NOT exceptions and should be handled in then(); see the next section.

Wrapping the fetch function in a try/catch block will catch all exceptions, such as errors originating from node core libraries, like network errors, and operational errors, which are instances of FetchError. See the error handling document for more details.

```js
import fetch from 'node-fetch';

try {
  await fetch('https://domain.invalid/');
} catch (error) {
  console.log(error);
}
```
Handling client and server errors
It is common to create a helper function to check that the response contains no client (4xx) or server (5xx) error responses:

```js
import fetch from 'node-fetch';

class HTTPResponseError extends Error {
  constructor(response) {
    super(`HTTP Error Response: ${response.status} ${response.statusText}`);
    this.response = response;
  }
}

const checkStatus = response => {
  if (response.ok) {
    // response.status >= 200 && response.status < 300
    return response;
  } else {
    throw new HTTPResponseError(response);
  }
}

const response = await fetch('https://httpbin.org/status/400');

try {
  checkStatus(response);
} catch (error) {
  console.error(error);

  const errorBody = await error.response.text();
  console.error(`Error body: ${errorBody}`);
}
```
Handling cookies
Cookies are not stored by default. However, cookies can be extracted and passed by manipulating request and response headers. See Extract Set-Cookie Header for details.
Advanced Usage
Streams
The "Node.js way" is to use streams when possible. You can pipe res.body to another stream. This example uses stream.pipeline to attach stream error handlers and wait for the download to complete.

```js
import {createWriteStream} from 'node:fs';
import {pipeline} from 'node:stream';
import {promisify} from 'node:util';
import fetch from 'node-fetch';

const streamPipeline = promisify(pipeline);

const response = await fetch('https://github.githubassets.com/images/modules/logos_page/Octocat.png');

if (!response.ok) throw new Error(`unexpected response ${response.statusText}`);

await streamPipeline(response.body, createWriteStream('./octocat.png'));
```

In Node.js 14 you can also use async iterators to read body; however, be careful to catch errors -- the longer a response runs, the more likely it is to encounter an error.

```js
import fetch from 'node-fetch';

const response = await fetch('https://httpbin.org/stream/3');

try {
  for await (const chunk of response.body) {
    console.dir(JSON.parse(chunk.toString()));
  }
} catch (err) {
  console.error(err.stack);
}
```

In Node.js 12 you can also use async iterators to read body; however, async iterators with streams did not mature until Node.js 14, so you need to do some extra work to ensure you handle errors directly from the stream and wait on the response to fully close.

```js
import fetch from 'node-fetch';

const read = async body => {
  let error;
  body.on('error', err => {
    error = err;
  });

  for await (const chunk of body) {
    console.dir(JSON.parse(chunk.toString()));
  }

  return new Promise((resolve, reject) => {
    body.on('close', () => {
      error ? reject(error) : resolve();
    });
  });
};

try {
  const response = await fetch('https://httpbin.org/stream/3');
  await read(response.body);
} catch (err) {
  console.error(err.stack);
}
```
Accessing Headers and other Metadata

```js
import fetch from 'node-fetch';

const response = await fetch('https://github.com/');

console.log(response.ok);
console.log(response.status);
console.log(response.statusText);
console.log(response.headers.raw());
console.log(response.headers.get('content-type'));
```
Extract Set-Cookie Header
Unlike browsers, you can access raw Set-Cookie headers manually using Headers.raw(). This is a node-fetch only API.

```js
import fetch from 'node-fetch';

const response = await fetch('https://example.com');

// Returns an array of values, instead of a string of comma-separated values
console.log(response.headers.raw()['set-cookie']);
```
Post data using a file

```js
import fetch, {
  Blob,
  blobFrom,
  blobFromSync,
  File,
  fileFrom,
  fileFromSync,
} from 'node-fetch'

const mimetype = 'text/plain'
const blob = fileFromSync('./input.txt', mimetype)
const url = 'https://httpbin.org/post'

const response = await fetch(url, {method: 'POST', body: blob})
const data = await response.json()

console.log(data)
```

node-fetch comes with a spec-compliant FormData implementation for posting multipart/form-data payloads:

```js
import fetch, {FormData, File, fileFrom} from 'node-fetch'

const httpbin = 'https://httpbin.org/post'
const formData = new FormData()
const binary = new Uint8Array([97, 98, 99])
const abc = new File([binary], 'abc.txt', {type: 'text/plain'})

formData.set('greeting', 'Hello, world!')
formData.set('file-upload', abc, 'new name.txt')

const response = await fetch(httpbin, {method: 'POST', body: formData})
const data = await response.json()

console.log(data)
```
If you for some reason need to post a stream coming from any arbitrary place, then you can append a Blob or a File look-a-like item.

The minimum requirement is that it has:
- A Symbol.toStringTag getter or property that is either Blob or File
- A known size.
- And either a stream() method or an arrayBuffer() method that returns an ArrayBuffer.

The stream() method can return any async iterable object as long as it yields Uint8Array (or Buffer), so Node.js Readable streams and whatwg streams work just fine.

```js
formData.append('upload', {
  [Symbol.toStringTag]: 'Blob',
  size: 3,
  *stream() {
    yield new Uint8Array([97, 98, 99])
  },
  arrayBuffer() {
    return new Uint8Array([97, 98, 99]).buffer
  }
}, 'abc.txt')
```
Request cancellation with AbortSignal
You may cancel requests with AbortController. A suggested implementation is abort-controller.

An example of timing out a request after 150ms could be achieved as follows:

```js
import fetch, {AbortError} from 'node-fetch';

// AbortController was added in node v14.17.0 globally
const AbortController = globalThis.AbortController || await import('abort-controller')

const controller = new AbortController();
const timeout = setTimeout(() => {
  controller.abort();
}, 150);

try {
  const response = await fetch('https://example.com', {signal: controller.signal});
  const data = await response.json();
} catch (error) {
  if (error instanceof AbortError) {
    console.log('request was aborted');
  }
} finally {
  clearTimeout(timeout);
}
```

See test cases for more examples.
API
fetch(url[, options])

- url: A string representing the URL for fetching
- options: Options for the HTTP(S) request
- Returns: Promise<Response>

Perform an HTTP(S) fetch.

url should be an absolute URL, such as https://example.com/. A path-relative URL (/file/under/root) or protocol-relative URL (//can-be-http-or-https.com/) will result in a rejected Promise.
Options
The default values are shown after each option key.

```js
{
  // These properties are part of the Fetch Standard
  method: 'GET',
  headers: {},        // Request headers. format is identical to that accepted by the Headers constructor (see below)
  body: null,         // Request body. can be null, or a Node.js Readable stream
  redirect: 'follow', // Set to `manual` to extract redirect headers, `error` to reject redirect
  signal: null,       // Pass an instance of AbortSignal to optionally abort requests

  // The following properties are node-fetch extensions
  follow: 20,         // maximum redirect count. 0 to not follow redirect
  compress: true,     // support gzip/deflate content encoding. false to disable
  size: 0,            // maximum response body size in bytes. 0 to disable
  agent: null,        // http(s).Agent instance or function that returns an instance (see below)
  highWaterMark: 16384, // the maximum number of bytes to store in the internal buffer before ceasing to read from the underlying resource.
  insecureHTTPParser: false // Use an insecure HTTP parser that accepts invalid HTTP headers when `true`.
}
```
Default Headers
If no values are set, the following request headers will be sent automatically:

| Header | Value |
| --- | --- |
| Accept-Encoding | gzip,deflate,br (when options.compress === true) |
| Accept | */* |
| Connection | close (when no options.agent is present) |
| Content-Length | (automatically calculated, if possible) |
| Host | (host and port information from the target URI) |
| Transfer-Encoding | chunked (when req.body is a stream) |
| User-Agent | node-fetch |
Note: when body is a Stream, Content-Length is not set automatically.
Custom Agent
The agent option allows you to specify networking related options which are out of the scope of Fetch, including but not limited to the following:
- Support self-signed certificate
- Use only IPv4 or IPv6
- Custom DNS Lookup

See http.Agent for more information.

In addition, the agent option accepts a function that returns an http(s).Agent instance given the current URL. This is useful during a redirection chain across HTTP and HTTPS protocols.

```js
import http from 'node:http';
import https from 'node:https';

const httpAgent = new http.Agent({
  keepAlive: true
});
const httpsAgent = new https.Agent({
  keepAlive: true
});

const options = {
  agent: function (_parsedURL) {
    if (_parsedURL.protocol == 'http:') {
      return httpAgent;
    } else {
      return httpsAgent;
    }
  }
};
```
Custom highWaterMark
Streams on Node.js have a smaller internal buffer size (16 kB, aka highWaterMark) than client-side browsers (>1 MB, not consistent across browsers). Because of that, when you are writing an isomorphic app and using res.clone(), it will hang with a large response in Node.

The recommended way to fix this problem is to resolve the cloned response in parallel:

```js
import fetch from 'node-fetch';

const response = await fetch('https://example.com');
const r1 = await response.clone();

const results = await Promise.all([response.json(), r1.text()]);

console.log(results[0]);
console.log(results[1]);
```

If for some reason you don't like the solution above, since 3.x you are able to modify the highWaterMark option:

```js
import fetch from 'node-fetch';

const response = await fetch('https://example.com', {
  // About 1MB
  highWaterMark: 1024 * 1024
});

const result = await response.clone().arrayBuffer();
console.dir(result);
```
Insecure HTTP Parser
Passed through to the insecureHTTPParser option on http(s).request. See http.request for more information.
Manual Redirect
The redirect: 'manual' option for node-fetch is different from the browser & specification, which results in an opaque-redirect filtered response. node-fetch gives you the typical basic filtered response instead.

```js
const fetch = require('node-fetch');

const response = await fetch('https://httpbin.org/status/301', {redirect: 'manual'});

if (response.status === 301 || response.status === 302) {
  const locationURL = new URL(response.headers.get('location'), response.url);
  const response2 = await fetch(locationURL, {redirect: 'manual'});
  console.dir(response2);
}
```
Class: Request
An HTTP(S) request containing information about URL, method, headers, and the body. This class implements the Body interface.

Due to the nature of Node.js, the following properties are not implemented at this moment:

- type
- destination
- mode
- credentials
- cache
- integrity
- keepalive

The following node-fetch extension properties are provided:

- follow
- compress
- counter
- agent
- highWaterMark

See options for the exact meaning of these extensions.
new Request(input[, options])
(spec-compliant)

- input: A string representing a URL, or another Request (which will be cloned)
- options: Options for the HTTP(S) request

Constructs a new Request object. The constructor is identical to that in the browser.

In most cases, directly calling fetch(url, options) is simpler than creating a Request object.
Class: Response
An HTTP(S) response. This class implements the Body interface.

The following properties are not implemented in node-fetch at this moment:

- trailer
new Response([torso[, options]])
(spec-compliant)
-
body
AString
orReadable
stream -
options
AResponseInit
options dictionary
Constructs a new Response
object. The constructor is identical to that in the browser.
Considering Node.js does non implement service workers (for which this class was designed), one rarely has to construct a Response
directly.
response.ok
(spec-compliant)
Convenience property representing if the request ended normally. Will evaluate to true if the response status was greater than or equal to 200 but smaller than 300.

response.redirected
(spec-compliant)
Convenience property representing if the request has been redirected at least once. Will evaluate to true if the internal redirect counter is greater than 0.

response.type
(deviation from spec)
Convenience property representing the response's type. node-fetch only supports 'default' and 'error' and does not make use of filtered responses.
Class: Headers
This class allows manipulating and iterating over a set of HTTP headers. All methods specified in the Fetch Standard are implemented.

new Headers([init])
(spec-compliant)

- init: Optional argument to pre-fill the Headers object

Construct a new Headers object. init can be either null, a Headers object, a key-value map object, or any iterable object.

```js
// Example adapted from https://fetch.spec.whatwg.org/#example-headers-class
import {Headers} from 'node-fetch';

const meta = {
  'Content-Type': 'text/xml'
};
const headers = new Headers(meta);

// The above is equivalent to
const meta = [['Content-Type', 'text/xml']];
const headers = new Headers(meta);

// You can in fact use any iterable objects, like a Map or even another Headers
const meta = new Map();
meta.set('Content-Type', 'text/xml');
const headers = new Headers(meta);
const copyOfHeaders = new Headers(headers);
```
Interface: Body
Body is an abstract interface with methods that are applicable to both the Request and Response classes.

body.body
(deviation from spec)

- Node.js Readable stream

Data are encapsulated in the Body object. Note that while the Fetch Standard requires the property to always be a WHATWG ReadableStream, in node-fetch it is a Node.js Readable stream.
body.bodyUsed
(spec-compliant)

- Boolean

A boolean property for if this body has been consumed. Per the spec, a consumed body cannot be used again.
body.arrayBuffer()
body.formData()
body.blob()
body.json()
body.text()

fetch comes with methods to parse multipart/form-data payloads as well as x-www-form-urlencoded bodies using .formData(). This comes from the idea that a Service Worker can intercept such messages before they are sent to the server to alter them. This is useful for anybody building a server, so you can use it to parse & consume payloads.

Code example

```js
import http from 'node:http'
import {Response} from 'node-fetch'

http.createServer(async function (req, res) {
  const formData = await new Response(req, {
    headers: req.headers // Pass along the boundary value
  }).formData()
  const allFields = [...formData]

  const file = formData.get('uploaded-files')
  const arrayBuffer = await file.arrayBuffer()
  const text = await file.text()
  const whatwgReadableStream = file.stream()

  // other ways to consume the request could be to do:
  const json = await new Response(req).json()
  const text = await new Response(req).text()
  const arrayBuffer = await new Response(req).arrayBuffer()
  const blob = await new Response(req, {
    headers: req.headers // So that `type` inherits `Content-Type`
  }).blob()
})
```
Class: FetchError
(node-fetch extension)
An operational error in the fetching process. See ERROR-HANDLING.md for more info.

Class: AbortError
(node-fetch extension)
An error thrown when the request is aborted in response to an AbortSignal's abort event. It has a name property of AbortError. See ERROR-HANDLING.md for more info.
TypeScript
Since 3.x, types are bundled with node-fetch, so you don't need to install any additional packages.

For older versions please use the type definitions from DefinitelyTyped:

npm install --save-dev @types/node-fetch@2.x
Acknowledgement
Thanks to github/fetch for providing a solid implementation reference.
Team
David Frank | Jimmy Wärting | Antoni Kepinski | Richie Bendall | Gregor Martynus |
Former
- Timothy Gu
- Jared Kantrowitz
License
MIT
Source: https://www.npmjs.com/node-fetch