What is the best way to upload large files to an S3 bucket?

Problem description · votes: 0 · answers: 1

I am using Vue.js to create chunks for a Laravel 11 backend, which should then transfer them to an AWS S3 bucket. I used an append for each new chunk; it works, but it is slow.
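The Vue-side chunking is not shown in the question; a minimal sketch of what it might look like, assuming a plain `File` input and a hypothetical `/api/upload-video` endpoint (both names are illustrative, not taken from the question):

```javascript
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MiB per chunk (illustrative choice)

// Compute [start, end) byte ranges covering the whole file.
function computeChunks(fileSize, chunkSize = CHUNK_SIZE) {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}

// POST each slice to the backend, flagging the final one with is_last,
// matching the fields the Laravel controller reads.
async function uploadChunks(file) {
  const ranges = computeChunks(file.size);
  for (let i = 0; i < ranges.length; i++) {
    const [start, end] = ranges[i];
    const form = new FormData();
    form.append("chunk", file.slice(start, end), file.name);
    form.append("is_last", String(i === ranges.length - 1));
    await fetch("/api/upload-video", { method: "POST", body: form });
  }
}
```

The sequential `await` keeps chunks arriving in order, which the append-based backend depends on.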

    // Upload a video to S3
    public function uploadVideo(Request $request)
    {
        // Receive one chunk; when is_last is true, finish up
        $file = $request->file('chunk');
        $isLast = $request->input('is_last');

        // append() on the S3 disk re-reads the existing object and writes it
        // back with the chunk added, so each call gets slower as the file grows.
        // The null separator avoids inserting PHP_EOL between binary chunks.
        Storage::disk('s3')->append('test/videos/rrr/' . $file->getClientOriginalName(), $file->get(), null);

        if ($isLast == "true") {
            // Change the privacy of the file
            $path = 'test/videos/rrr/' . $file->getClientOriginalName();

            Storage::disk('s3')->setVisibility($path, 'public');
        }

        return response()->json([
            'status' => 'success',
            'message' => 'Chunk uploaded successfully',
        ]);
    }

I also tried streaming; it seemed to work all the way to the end, but when I open the video file it shows a black screen.

    // Upload a video to S3
    public function uploadVideo(Request $request)
    {
        $file = $request->file('chunk');

        // Stream the chunk's temp file straight to S3
        $stream = fopen($file->getRealPath(), 'r');

        Storage::disk('s3')->put('test/videos/streams/' . $file->getClientOriginalName(), $stream, 'public');

        fclose($stream);

        return response()->json([
            'status' => 'success',
            'message' => 'Chunk uploaded successfully',
        ]);
    }

So what is the problem, and what is the best way to solve it?

laravel amazon-web-services vue.js amazon-s3 chunks
1 Answer

0 votes

Since you are using JavaScript, look at the AWS SDK for JavaScript v3 and its multipart uploads. Multipart upload lets you upload a large file in smaller parts, which reduces the risk of a failure partway through and handles network issues or timeouts more gracefully.

Here is an example of how to perform a multipart upload:

import { fileURLToPath } from "url";

import {
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
  AbortMultipartUploadCommand,
  S3Client,
} from "@aws-sdk/client-s3";

const twentyFiveMB = 25 * 1024 * 1024;

export const createString = (size = twentyFiveMB) => {
  return "x".repeat(size);
};

export const main = async () => {
  const s3Client = new S3Client({});
  const bucketName = "test-bucket";
  const key = "multipart.txt";
  const str = createString();
  const buffer = Buffer.from(str, "utf8");

  let uploadId;

  try {
    const multipartUpload = await s3Client.send(
      new CreateMultipartUploadCommand({
        Bucket: bucketName,
        Key: key,
      }),
    );

    uploadId = multipartUpload.UploadId;

    const uploadPromises = [];
    // Multipart uploads require a minimum size of 5 MB per part.
    const partSize = Math.ceil(buffer.length / 5);

    // Upload each part.
    for (let i = 0; i < 5; i++) {
      const start = i * partSize;
      const end = start + partSize;
      uploadPromises.push(
        s3Client
          .send(
            new UploadPartCommand({
              Bucket: bucketName,
              Key: key,
              UploadId: uploadId,
              Body: buffer.subarray(start, end),
              PartNumber: i + 1,
            }),
          )
          .then((d) => {
            console.log("Part", i + 1, "uploaded");
            return d;
          }),
      );
    }

    const uploadResults = await Promise.all(uploadPromises);

    return await s3Client.send(
      new CompleteMultipartUploadCommand({
        Bucket: bucketName,
        Key: key,
        UploadId: uploadId,
        MultipartUpload: {
          Parts: uploadResults.map(({ ETag }, i) => ({
            ETag,
            PartNumber: i + 1,
          })),
        },
      }),
    );

    // Verify the output by downloading the file from the Amazon Simple Storage Service (Amazon S3) console.
    // Because the output is a 25 MB string, text editors might struggle to open the file.
  } catch (err) {
    console.error(err);

    if (uploadId) {
      const abortCommand = new AbortMultipartUploadCommand({
        Bucket: bucketName,
        Key: key,
        UploadId: uploadId,
      });

      await s3Client.send(abortCommand);
    }
  }
};

// Invoke main function if this file was run directly.
if (process.argv[1] === fileURLToPath(import.meta.url)) {
  main();
}
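One thing to watch when sizing parts yourself: S3 requires every part except the last to be at least 5 MiB, and allows at most 10,000 parts per upload. A minimal helper sketch (the function names are made up for illustration):

```javascript
const MIN_PART = 5 * 1024 * 1024; // S3 minimum part size (except the last part)
const MAX_PARTS = 10000;          // S3 maximum number of parts per upload

// Smallest part size that keeps the part count within the limit,
// never dropping below the 5 MiB minimum.
function choosePartSize(totalBytes) {
  return Math.max(MIN_PART, Math.ceil(totalBytes / MAX_PARTS));
}

// How many parts an upload of totalBytes will need at that part size.
function partCount(totalBytes, partSize = choosePartSize(totalBytes)) {
  return Math.max(1, Math.ceil(totalBytes / partSize));
}
```

For most videos the 5 MiB minimum dominates; only past roughly 50 GB does the 10,000-part cap start forcing larger parts.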

You can find this example, along with other JavaScript v3 examples, in the AWS GitHub repository:

https://github.com/awsdocs/aws-doc-sdk-examples/tree/main/javascriptv3/example_code/s3
