A MetaMask fork with Infura removed and default networks editable
ciphermask/ui/helpers/utils/fetch-with-cache.js

import { MINUTE, SECOND } from '../../../shared/constants/time';
import getFetchWithTimeout from '../../../shared/modules/fetch-with-timeout';
import { getStorageItem, setStorageItem } from './storage-helpers';

const fetchWithCache = async (
  url,
  fetchOptions = {},
  { cacheRefreshTime = MINUTE * 6, timeout = SECOND * 30 } = {},
) => {
  if (
    fetchOptions.body ||
    (fetchOptions.method && fetchOptions.method !== 'GET')
  ) {
    throw new Error('fetchWithCache only supports GET requests');
  }
  if (!(fetchOptions.headers instanceof window.Headers)) {
    fetchOptions.headers = new window.Headers(fetchOptions.headers);
  }
  if (
    fetchOptions.headers.has('Content-Type') &&
    fetchOptions.headers.get('Content-Type') !== 'application/json'
  ) {
    throw new Error('fetchWithCache only supports JSON responses');
  }

  const currentTime = Date.now();
  const cacheKey = `cachedFetch:${url}`;
  const { cachedResponse, cachedTime } = (await getStorageItem(cacheKey)) || {};
  if (cachedResponse && currentTime - cachedTime < cacheRefreshTime) {
    return cachedResponse;
  }

  fetchOptions.headers.set('Content-Type', 'application/json');
  const fetchWithTimeout = getFetchWithTimeout(timeout);
  const response = await fetchWithTimeout(url, {
    referrerPolicy: 'no-referrer-when-downgrade',
    body: null,
    method: 'GET',
    mode: 'cors',
    ...fetchOptions,
  });
  if (!response.ok) {
    throw new Error(
      `Fetch failed with status '${response.status}': '${response.statusText}'`,
    );
  }
  const responseJson = await response.json();
  const cacheEntry = {
    cachedResponse: responseJson,
    cachedTime: currentTime,
  };
  await setStorageItem(cacheKey, cacheEntry);

  return responseJson;
};

export default fetchWithCache;

Latest commit, 4 years ago: Fix `fetch-with-cache` handling of interwoven requests (#10079)

A data race was introduced in #9919 when the old synchronous storage API was replaced with an async storage API. The problem arises when `fetchWithCache` is called a second time while it is still processing another call. In that case, the `cachedFetch` object can become stale while blocked waiting for a fetch response, and a cache entry can be overwritten unintentionally. See this example (options omitted for simplicity, and assuming an empty initial cache):

```
await Promise.all([
  fetchWithCache('https://metamask.io/foo'),
  fetchWithCache('https://metamask.io/bar'),
]);
```

The order of events could be as follows:

1. Empty cache retrieved for `/foo` route
2. Empty cache retrieved for `/bar` route
3. Call made to `/foo` route
4. Call made to `/bar` route
5. `/foo` response is added to the empty cache object retrieved in step 1, then saved in the cache.
6. `/bar` response is added to the empty cache object retrieved in step 2, then saved in the cache.

In step 6, the saved cache object would not contain the `/foo` response set in step 5. As a result, `/foo` would never be cached.

This problem was resolved by embedding the URL being cached directly in the cache key, which prevents simultaneous responses from overwriting each other's caches. Technically a data race still exists when handling simultaneous responses to the same route, but then the last call to finish simply overwrites the previous one, which seems acceptable.
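To make the failure mode concrete, here is a minimal, self-contained sketch of the race described in the commit message. The in-memory `store` and the `cacheOld`/`cacheNew` helpers are stand-ins invented for illustration, not the real `storage-helpers` module (which is asynchronous but not shown here); `cacheOld` assumes the pre-fix layout in which every response lived under one shared storage key, while `cacheNew` mirrors the per-URL key used in the file above.

```
// Stand-in async storage for this sketch only (not the real storage-helpers).
const store = new Map();
const getStorageItem = async (key) => store.get(key);
const setStorageItem = async (key, value) => {
  store.set(key, value);
};

// Assumed pre-fix layout: every cached response lives under one shared key.
// Concurrent callers each read the (still empty) map, then write back their own copy.
async function cacheOld(url, response) {
  const cache = (await getStorageItem('cachedFetch')) || {};
  cache[url] = { cachedResponse: response, cachedTime: Date.now() };
  await setStorageItem('cachedFetch', cache); // the later write discards the earlier one
}

// Post-fix layout, as in the file above: the URL is part of the storage key,
// so simultaneous writes for different URLs land in different entries.
async function cacheNew(url, response) {
  await setStorageItem(`cachedFetch:${url}`, {
    cachedResponse: response,
    cachedTime: Date.now(),
  });
}

async function demo() {
  await Promise.all([cacheOld('/foo', 'a'), cacheOld('/bar', 'b')]);
  console.log(await getStorageItem('cachedFetch')); // only '/bar' survives; '/foo' was lost

  await Promise.all([cacheNew('/foo', 'a'), cacheNew('/bar', 'b')]);
  console.log(store.has('cachedFetch:/foo'), store.has('cachedFetch:/bar')); // true true
}
demo();
```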
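For completeness, a hypothetical call site is sketched below. It assumes the caller lives alongside this file in `ui/helpers/utils/` (the relative import paths would differ elsewhere); the URL and the `getExampleRates` name are illustrative and not taken from the repository.

```
import fetchWithCache from './fetch-with-cache';
import { MINUTE, SECOND } from '../../../shared/constants/time';

// Illustrative caller: fetch a JSON endpoint, reusing any response cached
// within the last five minutes and aborting the request after ten seconds.
async function getExampleRates() {
  return await fetchWithCache(
    'https://example.test/rates?currency=usd', // placeholder URL
    { headers: { 'Content-Type': 'application/json' } },
    { cacheRefreshTime: MINUTE * 5, timeout: SECOND * 10 },
  );
}

export default getExampleRates;
```

Only GET requests with JSON responses are supported; anything else throws before the network is touched. A second call to the same URL within `cacheRefreshTime` returns the stored response without another fetch.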