Crypto.js Uncaught RangeError: Invalid array length


I need to implement `uploadFile`, and I use Crypto.js to compute an MD5 hash of the file. However, when the file size exceeds 200 MB, I get this error message:

Uncaught RangeError: Invalid array length

How can I fix this, and why am I getting this error message?

Here is my code:

<script>
import CryptoJS from 'crypto-js'
const calculateMD5 = (file) => {
  return new Promise((resolve, reject) => {
    const fileReader = new FileReader()
    fileReader.onload = function () {
      const hash = CryptoJS.MD5(CryptoJS.lib.WordArray.create(fileReader.result))
      resolve(hash.toString())
    }
    fileReader.onerror = reject
    fileReader.readAsArrayBuffer(file)
  })
}

const splitFile = async (event: Event) => {
  const target = event.target as HTMLInputElement
  const CHUNK_SIZE = 10 * 1024 * 1024
  if (target.files && target.files.length > 0) {
    const file = target.files[0]
    const fileSize = file.size
    const md5hash = await calculateMD5(file)
    console.log(md5hash)

    const chunks = Math.ceil(fileSize / CHUNK_SIZE)
    const fileChunks = []
    let start = 0
    let end = CHUNK_SIZE
    console.log(end, fileSize)

    for (let i = 0; i < chunks; i++) {
      if (end > fileSize) {
        console.log('if', i)

        end = fileSize
      }
      const chunk = file.slice(start, end)
      console.log(chunk)

      fileChunks.push(chunk)
      start = end
      end = start + CHUNK_SIZE
    }
    console.log(fileChunks)
  }
}
</script>
<template>
 <input type="file" @change="splitFile">
</template>
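The chunking loop in `splitFile` can be sketched in plain JavaScript; this is a minimal, standalone version (assuming a browser or Node 18+ environment where `Blob` is available, and using `Blob.slice`, which clamps the end index to the blob's size so no explicit `end > fileSize` check is needed):

```javascript
// Split a Blob into fixed-size chunks, mirroring the loop in splitFile above.
const CHUNK_SIZE = 10 // 10 bytes here for illustration (10 MB in the component)

const splitBlob = (blob, chunkSize) => {
  const chunks = []
  for (let start = 0; start < blob.size; start += chunkSize) {
    // Blob.slice clamps the end index to the blob's size automatically.
    chunks.push(blob.slice(start, start + chunkSize))
  }
  return chunks
}

const blob = new Blob([new Uint8Array(25)]) // 25 bytes
const chunks = splitBlob(blob, CHUNK_SIZE)
console.log(chunks.length)            // 3
console.log(chunks.map(c => c.size))  // [ 10, 10, 5 ]
```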

From what I've found, there seems to be a limit involved:

https://stackoverflow.com/questions/55465821/getting-heap-out-of-memory-error-even-when-available-heap-memory-is-much-larger

https://stackoverflow.com/questions/54452896/maximum-number-of-entries-in-node-js-map/54466812#54466812
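The limit the links above describe can be reproduced directly: JavaScript arrays are capped at 2^32 − 1 elements, and exceeding that throws the same "Invalid array length" RangeError. (CryptoJS's `WordArray.create` converts the whole `ArrayBuffer` into one plain array of 32-bit words, so very large files can run into engine allocation limits; the exact allocation that fails may vary by engine.) A minimal sketch:

```javascript
// Asking for an array longer than 2^32 - 1 elements throws the same
// "Invalid array length" RangeError seen in the question.
let error = null
try {
  new Array(2 ** 32) // one past the maximum array length
} catch (e) {
  error = e
}
console.log(error instanceof RangeError) // true
console.log(error.message)               // e.g. "Invalid array length"
```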

javascript vue.js cryptojs

1 Answer

You should hash the file incrementally to avoid exhausting the resources the browser allocates to the page. Process the file in chunks and compute the MD5 hash incrementally with a library such as spark-md5:

import SparkMD5 from 'spark-md5';

const calculateMD5 = (file) => {
  return new Promise((resolve, reject) => {
    const CHUNK_SIZE = 10 * 1024 * 1024; // Chunk size: 10MB
    const spark = new SparkMD5.ArrayBuffer();
    const fileReader = new FileReader();
    let currentChunk = 0;
    const chunks = Math.ceil(file.size / CHUNK_SIZE);

    fileReader.onload = function(e) {
      spark.append(e.target.result); // Append chunk to the hash
      currentChunk++;
      if (currentChunk < chunks) {
        loadNext();
      } else {
        resolve(spark.end()); // Compute final hash
      }
    };

    fileReader.onerror = () => reject('Error reading file');

    const loadNext = () => {
      const start = currentChunk * CHUNK_SIZE;
      const end = Math.min(start + CHUNK_SIZE, file.size);
      fileReader.readAsArrayBuffer(file.slice(start, end));
    };

    loadNext();
  });
};

Using spark-md5 for incremental hashing reduces memory usage and prevents "heap out of memory" errors on large files, because the whole file never has to be loaded into memory at once.
