Implementing Resumable Large-File Upload with Spring Boot (a 1 GB file in about 5 s)

I recently worked on a file-upload feature where the requirement capped the file size at 2 GB. Uploading a file that large as one request from the front end means an extremely long wait, and if the server runs short of memory the upload simply fails with an out-of-memory error. Chunked, resumable upload solves both problems: the front end uses WebUploader to split the file into chunks and upload them, and the back end uses Spring Boot to receive the chunks and reassemble the file. The main front-end and back-end code is listed below; the complete front-end demo code is linked at the end of the article.

1. Front-End Code

On the front end we register three WebUploader hooks: before-send-file, before-send, and after-send-file. They fire, respectively, before the file is sent (once per file, before the upload starts), before each request is sent (once per chunk, so it fires many times), and after the file has been sent (once per file, after every chunk has been uploaded).

WebUploader.Uploader.register({  
	'name': 'webUploaderHookCommand',  
	'before-send-file': 'beforeSendFile',  
	"before-send": "beforeSend",
	"after-send-file" : "afterSendFile"
}, {  
	beforeSendFile: function(file) {  
		var task = new WebUploader.Deferred();  
		this.fileName = file.name;  
		this.fileSize = file.size;  
		this.mimetype = file.type;
		this.fileExt = file.ext;
		(new WebUploader.Uploader()).md5File(file, 0, 10 * 1024 * 1024 * 1024 * 1024).progress(function(percentage) {}).then(function(val) { // the end offset is simply set far larger than any accepted file, so the MD5 covers the whole file
			this.fileMd5 = val;  
			var url = "http://10.12.2.169:32018/resource/breakpointRenewal/register";  
			var data = {  
				fileMd5: this.fileMd5,
				fileName: file.name,
				fileSize: file.size,
				mimetype: file.type,
				fileExt: file.ext
			};  
			$.ajax({  
				type: "POST",  
				url: url,  
				data: data,
				cache: false,  
				async: false, // synchronous
				timeout: 1000, // TODO: on timeout we can only assume the chunk has not been uploaded yet
				dataType: "json",  
				error: function(XMLHttpRequest, textStatus, errorThrown) {  
					file.statusText = 'server_error';  
					task.reject();  
				}  
			}).then(function(data, textStatus, jqXHR) {  
				if(data.status == 1) {
					task.resolve();
				} else if(data.code == 0) {  
					file.statusText = data.message;  
					task.reject();  
				}  
			});  
		}.bind(this));
		return task.promise(); 
	}.bind(this),  
	beforeSend: function(block) {  
		var task = new WebUploader.Deferred();  
		console.log("start beforeSend");
		console.log("fileMd5: " + this.fileMd5);
		console.log("chunkSize: " + block.chunk);
		console.log("end beforeSend");
		var url = "http://10.12.2.169:32018/resource/breakpointRenewal/checkChunk";  
		var data = {    
			fileMd5: this.fileMd5,  
			chunk: block.chunk,  
			chunkSize: block.end - block.start
		};  
		$.ajax({  
			type: "POST",  
			url: url,  
			data: data,  
			cache: false,  
			async: false, // synchronous
			timeout: 1000, // TODO: on timeout we can only assume the chunk has not been uploaded yet
			dataType: "json"  
		}).then(function(data, textStatus, jqXHR) { 
			console.log(data.data);
			if(data.data == true) {  
				task.reject(); // the chunk already exists on the server, so skip uploading it
			} else {  
				task.resolve();  
			}  
		});  
		uploader.options.formData.fileMd5 = this.fileMd5;
		uploader.options.formData.chunk = block.chunk;
		return task.promise(); 
	}.bind(this),
	afterSendFile: function() {  
		var task = new WebUploader.Deferred();
		console.log("start afterSendFile");
		console.log("fileMd5: " + this.fileMd5);
		console.log("fileName: " + this.fileName);
		console.log("fileSize: " + this.fileSize);
		console.log("mimetype: " + this.mimetype);
		console.log("fileExt: " + this.fileExt);
		console.log("end afterSendFile");
		var url = "http://10.12.2.169:32018/resource/breakpointRenewal/mergeChunks";  
		var data = {    
			fileMd5: this.fileMd5,  
			fileName: this.fileName,  
			fileSize: this.fileSize,
			mimetype: this.mimetype,
			fileExt: this.fileExt,
			token: "228F58F34EA7433B85BAB5C44B46853A"
		};  
		$.ajax({  
			type: "POST",  
			url: url,  
			data: data 
		}).then(function(data, textStatus, jqXHR) { 
			console.log(data.data);
			if(data.data) {  
				task.resolve(); // the server merged the chunks successfully
			} else {  
				task.reject();  
			}  
		}); 
		return task.promise();
	}.bind(this)
})

With these three hooks registered, we can initialize the WebUploader component as follows:

// instantiate the uploader
uploader = WebUploader.create({
	pick: {
		id: '#filePicker',
		label: 'Click to select a file'
	},
	dnd: '#uploader .queueList',
	paste: '#uploader',
	swf: '../../dist/Uploader.swf',
	chunked:true,
	chunkSize: 5 * 1024 * 1024,
	threads:3,
	prepareNextFile:true,
	//server: '../../server/fileupload.php',
	server: 'http://10.12.2.169:32018/resource/breakpointRenewal',
	// runtimeOrder: 'flash',

	// accept: {
	//     title: 'Images',
	//     extensions: 'gif,jpg,jpeg,bmp,png',
	//     mimeTypes: 'image/*'
	// },

	// Disable global drag-and-drop so that dropping a file onto the page does not make the browser open it.
	disableGlobalDnd: true,
	fileNumLimit: 300,
	fileSizeLimit: 2 * 1024 * 1024 * 1024,    // 2G
	fileSingleSizeLimit: 2 * 1024 * 1024 * 1024    // 2G
});

That is the core front-end code; the complete front-end demo is available in the attachment at the end of this article.

2. FileController Code

package com.openailab.oascloud.file.controller;

import com.openailab.oascloud.common.model.RequestParameter;
import com.openailab.oascloud.common.model.ResponseResult;
import com.openailab.oascloud.common.model.tcm.vo.ResourceVO;
import com.openailab.oascloud.file.api.IFileController;
import com.openailab.oascloud.file.model.LoginUserInfo;
import com.openailab.oascloud.file.service.IFileService;
import com.openailab.oascloud.file.service.IUserService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RequestPart;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

import javax.servlet.http.HttpServletResponse;
import java.util.Optional;

/**
 * @description: File management - Controller
 * @author: zhangzhixiang
 * @createDate: 2019/12/9
 * @version: 1.0
 */
@RestController
public class FileController extends BaseController implements IFileController {

    private static Logger LOG = LoggerFactory.getLogger(FileController.class);
    @Autowired
    private IFileService fileService;
    @Autowired
    private IUserService userService;

    /**
     * Resumable upload: receive one file chunk
     *
     * @param file
     * @param fileMd5
     * @param chunk
     * @return com.openailab.oascloud.common.model.ResponseResult
     * @author zxzhang
     * @date 2020/1/13
     */
    @Override
    public ResponseResult breakpointRenewal(@RequestPart("file") MultipartFile file,
                                            @RequestParam("fileMd5") String fileMd5,
                                            @RequestParam("chunk") Integer chunk) {
        try {
            return fileService.breakpointRenewal(file, fileMd5, chunk);
        } catch (Exception e) {
            LOG.error("********FileController->breakpointRenewal throw Exception.fileMd5:{},chunk:{}********", fileMd5, chunk, e);
        }
        return ResponseResult.fail(null);
    }

    /**
     * Resumable upload registration
     *
     * @param fileMd5
     * @param fileName
     * @param fileSize
     * @param mimetype
     * @param fileExt
     * @return com.openailab.oascloud.common.model.ResponseResult
     * @author zxzhang
     * @date 2020/1/13
     */
    @Override
    public ResponseResult breakpointRegister(@RequestParam("fileMd5") String fileMd5,
                                             @RequestParam("fileName") String fileName,
                                             @RequestParam("fileSize") Long fileSize,
                                             @RequestParam("mimetype") String mimetype,
                                             @RequestParam("fileExt") String fileExt) {
        try {
            return fileService.breakpointRegister(fileMd5, fileName, fileSize, mimetype, fileExt);
        } catch (Exception e) {
            LOG.error("********FileController->breakpointRegister throw Exception.fileMd5:{},fileName:{}********", fileMd5, fileName, e);
        }
        return ResponseResult.fail(null);
    }

    /**
     * Check whether a chunk already exists
     *
     * @param fileMd5
     * @param chunk
     * @param chunkSize
     * @return com.openailab.oascloud.common.model.ResponseResult
     * @author zxzhang
     * @date 2020/1/10
     */
    @Override
    public ResponseResult checkChunk(@RequestParam("fileMd5") String fileMd5,
                                     @RequestParam("chunk") Integer chunk,
                                     @RequestParam("chunkSize") Integer chunkSize) {
        try {
            return fileService.checkChunk(fileMd5, chunk, chunkSize);
        } catch (Exception e) {
            LOG.error("********FileController->breakpointRenewal throw Exception.fileMd5:{},chunk:{}********", fileMd5, chunk, e);
        }
        return ResponseResult.fail(null);
    }

    /**
     * Merge the uploaded chunks
     *
     * @param fileMd5
     * @param fileName
     * @param fileSize
     * @param mimetype
     * @param fileExt
     * @return com.openailab.oascloud.common.model.ResponseResult
     * @author zxzhang
     * @date 2020/1/11
     */
    @Override
    public ResponseResult mergeChunks(@RequestParam("fileMd5") String fileMd5,
                                      @RequestParam("fileName") String fileName,
                                      @RequestParam("fileSize") Long fileSize,
                                      @RequestParam("mimetype") String mimetype,
                                      @RequestParam("fileExt") String fileExt,
                                      @RequestParam("token") String token) {
        try {
            LoginUserInfo user = userService.getLoginUser(token);
            return fileService.mergeChunks(fileMd5, fileName, fileSize, mimetype, fileExt, user);
        } catch (Exception e) {
            LOG.error("********FileController->breakpointRenewal throw Exception.fileMd5:{},fileName:{}********", fileMd5, fileName, e);
        }
        return ResponseResult.fail(null);
    }
}
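
The controller itself carries no @RequestMapping annotations; it implements IFileController, where the route mappings live, and that interface is not listed in this article. Judging from the URLs the front-end code calls, it presumably looks roughly like the sketch below (paths and signatures are inferred from the code above, not copied from the original project):

package com.openailab.oascloud.file.api;

import com.openailab.oascloud.common.model.ResponseResult;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RequestPart;
import org.springframework.web.multipart.MultipartFile;

/**
 * Hypothetical sketch of the IFileController interface (not shown in the original post).
 * The paths are inferred from the URLs the WebUploader front end calls.
 */
public interface IFileController {

    /** Receives a single chunk (WebUploader's upload request itself). */
    @PostMapping("/resource/breakpointRenewal")
    ResponseResult breakpointRenewal(@RequestPart("file") MultipartFile file,
                                     @RequestParam("fileMd5") String fileMd5,
                                     @RequestParam("chunk") Integer chunk);

    /** Called once per file by the before-send-file hook. */
    @PostMapping("/resource/breakpointRenewal/register")
    ResponseResult breakpointRegister(@RequestParam("fileMd5") String fileMd5,
                                      @RequestParam("fileName") String fileName,
                                      @RequestParam("fileSize") Long fileSize,
                                      @RequestParam("mimetype") String mimetype,
                                      @RequestParam("fileExt") String fileExt);

    /** Called once per chunk by the before-send hook. */
    @PostMapping("/resource/breakpointRenewal/checkChunk")
    ResponseResult checkChunk(@RequestParam("fileMd5") String fileMd5,
                              @RequestParam("chunk") Integer chunk,
                              @RequestParam("chunkSize") Integer chunkSize);

    /** Called once per file by the after-send-file hook. */
    @PostMapping("/resource/breakpointRenewal/mergeChunks")
    ResponseResult mergeChunks(@RequestParam("fileMd5") String fileMd5,
                               @RequestParam("fileName") String fileName,
                               @RequestParam("fileSize") Long fileSize,
                               @RequestParam("mimetype") String mimetype,
                               @RequestParam("fileExt") String fileExt,
                               @RequestParam("token") String token);
}

Spring MVC picks up mapping annotations declared on an implemented interface, which is why the @RestController class above only needs @Override on its methods.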

3. IFileService Code

package com.openailab.oascloud.file.service;

import com.openailab.oascloud.common.model.RequestParameter;
import com.openailab.oascloud.common.model.ResponseResult;
import com.openailab.oascloud.common.model.tcm.vo.ResourceVO;
import com.openailab.oascloud.file.model.LoginUserInfo;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RequestPart;
import org.springframework.web.multipart.MultipartFile;

import javax.servlet.http.HttpServletResponse;
import java.io.File;
import java.io.FileNotFoundException;
import java.util.Date;
import java.util.List;

/**
 * @description: File management - Interface
 * @author: zhangzhixiang
 * @createDate: 2019/12/9
 * @version: 1.0
 */
public interface IFileService {
    /**
     * Resumable upload registration
     *
     * @param fileMd5
     * @param fileName
     * @param fileSize
     * @param mimetype
     * @param fileExt
     * @return com.openailab.oascloud.common.model.ResponseResult
     * @author zxzhang
     * @date 2020/1/10
     */
    ResponseResult breakpointRegister(String fileMd5, String fileName, Long fileSize, String mimetype, String fileExt);

    /**
     * Resumable upload: receive one file chunk
     *
     * @param file
     * @return com.openailab.oascloud.common.model.ResponseResult
     * @author zxzhang
     * @date 2019/12/9
     */
    ResponseResult breakpointRenewal(MultipartFile file, String fileMd5, Integer chunk);

    /**
     * Check whether a chunk already exists
     *
     * @param fileMd5
     * @param chunk
     * @param chunkSize
     * @return com.openailab.oascloud.common.model.ResponseResult
     * @author zxzhang
     * @date 2020/1/10
     */
    ResponseResult checkChunk(String fileMd5, Integer chunk, Integer chunkSize);

    /**
     * Merge the uploaded chunks
     *
     * @param fileMd5
     * @param fileName
     * @param fileSize
     * @param mimetype
     * @param fileExt
     * @return com.openailab.oascloud.common.model.ResponseResult
     * @author zxzhang
     * @date 2020/1/11
     */
    ResponseResult mergeChunks(String fileMd5, String fileName, Long fileSize, String mimetype, String fileExt, LoginUserInfo user);
}
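
ResponseResult is the shared response wrapper used by every endpoint, but its source is not included in this post. Judging from the factory methods called in the service layer (success(data), fail(data), fail(enum, data), fail(message)) and the fields the front-end JavaScript reads (data.status, data.code, data.data, data.message), it is presumably close to the following sketch; the field names here are assumptions, not the original class:

package com.openailab.oascloud.common.model;

/**
 * Hypothetical sketch of the ResponseResult wrapper (the real class is not shown in this post).
 * Field names are assumptions based on how the front-end code reads the JSON response.
 * The fail(message) and fail(enum, data) overloads used elsewhere are omitted here.
 */
public class ResponseResult {

    private int status;     // assumed: 1 = success, 0 = failure (the front end checks data.status / data.code)
    private String message; // error message, shown on the page as file.statusText
    private Object data;    // payload, e.g. {"resId": "...", "filePath": "..."} or a boolean for checkChunk

    public static ResponseResult success(Object data) {
        ResponseResult r = new ResponseResult();
        r.status = 1;
        r.data = data;
        return r;
    }

    public static ResponseResult fail(Object data) {
        ResponseResult r = new ResponseResult();
        r.status = 0;
        r.data = data;
        return r;
    }

    // getters and setters omitted for brevity
}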

4. FileServiceImpl Code

package com.openailab.oascloud.file.service.impl;

import com.alibaba.fastjson.JSONObject;
import com.github.pagehelper.PageInfo;
import com.google.common.collect.Maps;
import com.openailab.oascloud.common.enums.ResponseEnum;
import com.openailab.oascloud.common.model.ResponseResult;
import com.openailab.oascloud.common.model.tcm.ResourceBO;
import com.openailab.oascloud.common.model.tcm.vo.ResourceVO;
import com.openailab.oascloud.common.model.um.FileUserBO;
import com.openailab.oascloud.file.common.config.BootstrapConfig;
import com.openailab.oascloud.file.common.consts.BootstrapConst;
import com.openailab.oascloud.file.common.consts.RedisPrefixConst;
import com.openailab.oascloud.file.common.enums.ResourceTypeEnum;
import com.openailab.oascloud.file.common.enums.TranscodingStateEnum;
import com.openailab.oascloud.file.common.enums.VedioEnum;
import com.openailab.oascloud.file.common.file.ClientFactory;
import com.openailab.oascloud.file.common.file.FileClient;
import com.openailab.oascloud.file.common.helper.FileManagementHelper;
import com.openailab.oascloud.file.dao.FileDao;
import com.openailab.oascloud.file.dao.RedisDao;
import com.openailab.oascloud.file.model.LoginUserInfo;
import com.openailab.oascloud.file.service.IFileService;
import com.openailab.oascloud.file.util.*;
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.compress.utils.IOUtils;
import org.apache.commons.lang.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeanUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.util.ObjectUtils;
import org.springframework.web.multipart.MultipartFile;

import javax.servlet.http.HttpServletResponse;
import java.io.*;
import java.text.SimpleDateFormat;
import java.util.*;

/**
 * @description: File management - Service
 * @author: zhangzhixiang
 * @createDate: 2019/12/9
 * @version: 1.0
 */
@Service
public class FileServiceImpl implements IFileService {

    private final static Logger LOG = LoggerFactory.getLogger(FileServiceImpl.class);
    private static final SimpleDateFormat format = new SimpleDateFormat("yyyyMMdd");
    @Autowired
    private FileDao fileDao;
    @Autowired
    private BootstrapConfig bootstrapConfig;
    @Autowired
    private FileManagementHelper fileManagementHelper;
    @Autowired
    private PageObjUtils pageObjUtils;
    @Autowired
    private RedisDao redisDao;

    private String getUploadPath() {
        return bootstrapConfig.getFileRoot() + bootstrapConfig.getUploadDir() + "/";
    }

    private String getFileFolderPath(String fileMd5) {
        return getUploadPath() + fileMd5.substring(0, 1) + "/" + fileMd5.substring(1, 2) + "/";
    }

    private String getFilePath(String fileMd5, String fileExt) {
        return getFileFolderPath(fileMd5) + fileMd5 + "." + fileExt;
    }

    private String getFileRelativePath(String fileMd5, String fileExt) {
        return bootstrapConfig.getUploadDir() + "/" + fileMd5.substring(0, 1) + "/" + fileMd5.substring(1, 2) + "/" + fileMd5 + "." + fileExt;
    }

    private String getChunkFileFolderPath(String fileMd5) {
        return bootstrapConfig.getFileRoot() + bootstrapConfig.getBreakpointDir() + "/" + fileMd5 + "/";
    }

    @Override
    public ResponseResult breakpointRegister(String fileMd5, String fileName, Long fileSize, String mimetype, String fileExt) {
        Map<String, String> ret = Maps.newHashMap();
        // check whether the file already exists on disk
        String fileFolderPath = this.getFileFolderPath(fileMd5);
        String filePath = this.getFilePath(fileMd5, fileExt);
        File file = new File(filePath);
        boolean exists = file.exists();

        // check whether the file already exists in PostgreSQL (fileMd5 is the unique file identifier)
        ResourceBO resourceBO = new ResourceBO();
        resourceBO.setFileMd5(fileMd5);
        resourceBO.setIsDelete(0);
        List<ResourceBO> resourceBOList = fileDao.selectResourceByCondition(resourceBO);
        if (exists && resourceBOList.size() > 0) {
            // present both on disk and in the database: the file already exists, return its resId and filePath directly
            resourceBO = resourceBOList.get(0);
            ret.put("filePath", resourceBO.getFilePath());
            ret.put("resId", String.valueOf(resourceBO.getResourceId()));
            return ResponseResult.fail(ResponseEnum.RESPONSE_CODE_BREAKPOINT_RENEVAL_REGISTRATION_ERROR, ret);
        }

        // not in the database yet: create a new resource record and cache it in redis
        if (resourceBOList.size() == 0) {
            // a file uploaded for the first time needs a new resource record; its resId is returned and the record is cached in redis
            resourceBO.setType(fileManagementHelper.judgeDocumentType(fileExt));
            resourceBO.setStatus(TranscodingStateEnum.UPLOAD_NOT_COMPLETED.getCode());
            resourceBO.setFileSize(fileSize);
            resourceBO.setFileMd5(fileMd5);
            resourceBO.setFileName(fileName);
            resourceBO.setCreateDate(new Date());
            resourceBO.setIsDelete(0);
            final Integer resourceId = fileDao.addResource(resourceBO);
            resourceBO.setResourceId(resourceId);
            redisDao.set(RedisPrefixConst.BREAKPOINT_PREFIX + fileMd5, JSONObject.toJSONString(resourceBO), RedisPrefixConst.EXPIRE);
        }

        // in the database but not yet in redis: cache the existing record in redis
        String breakpoint = redisDao.get(RedisPrefixConst.BREAKPOINT_PREFIX + fileMd5);
        if (StringUtils.isEmpty(breakpoint) && resourceBOList.size() > 0) {
            resourceBO = resourceBOList.get(0);
            redisDao.set(RedisPrefixConst.BREAKPOINT_PREFIX + fileMd5, JSONObject.toJSONString(resourceBO), RedisPrefixConst.EXPIRE);
        }

        // if the file itself does not exist yet, make sure its target directory does
        File fileFolder = new File(fileFolderPath);
        if (!fileFolder.exists()) {
            // create the directory (its path is derived from the MD5 sent by the front end)
            fileFolder.mkdirs();
        }
        return ResponseResult.success(null);
    }

    @Override
    public ResponseResult breakpointRenewal(MultipartFile file, String fileMd5, Integer chunk) {
        Map<String, String> ret = Maps.newHashMap();
        // make sure the chunk directory exists
        String chunkFileFolderPath = this.getChunkFileFolderPath(fileMd5);
        File chunkFileFolder = new File(chunkFileFolderPath);
        if (!chunkFileFolder.exists()) {
            chunkFileFolder.mkdirs();
        }
        // write the uploaded chunk to disk (the chunk index is used as the file name)
        File chunkFile = new File(chunkFileFolderPath + chunk);
        try (InputStream inputStream = file.getInputStream(); FileOutputStream outputStream = new FileOutputStream(chunkFile)) {
            IOUtils.copy(inputStream, outputStream);
            // look up the chunk-upload record (resId) for this fileMd5 in redis
            String breakpoint = redisDao.get(RedisPrefixConst.BREAKPOINT_PREFIX + fileMd5);
            ResourceBO resourceBO = new ResourceBO();
            if (!StringUtils.isEmpty(breakpoint)) {
                // a record exists, so the resource is still being uploaded: return the resId for this fileMd5 instead of creating a new resource record
                resourceBO = JSONObject.parseObject(breakpoint, ResourceBO.class);
                ret.put("resId", String.valueOf(resourceBO.getResourceId()));
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        return ResponseResult.success(ret);
    }

    @Override
    public ResponseResult checkChunk(String fileMd5, Integer chunk, Integer chunkSize) {
        // check whether the chunk file already exists
        String chunkFileFolderPath = this.getChunkFileFolderPath(fileMd5);
        // the chunk directory plus the chunk index locates the chunk file
        File chunkFile = new File(chunkFileFolderPath + chunk);
        if (chunkFile.exists() && chunkFile.length() == chunkSize) {
            return ResponseResult.success(true);
        }
        return ResponseResult.success(false);
    }

    @Override
    public ResponseResult mergeChunks(String fileMd5, String fileName, Long fileSize, String mimetype, String fileExt, LoginUserInfo user) {
        Map<String, String> ret = Maps.newHashMap();
        FileClient fileClient = ClientFactory.createClientByType(bootstrapConfig.getFileClientType());
        String chunkFileFolderPath = this.getChunkFileFolderPath(fileMd5);
        File chunkFileFolder = new File(chunkFileFolderPath);
        File[] files = chunkFileFolder.listFiles();
        final String filePath = this.getFilePath(fileMd5, fileExt);
        File mergeFile = new File(filePath);
        List<File> fileList = Arrays.asList(files);

        // 1. merge the chunks into the final file
        mergeFile = this.mergeFile(fileList, mergeFile);
        if (mergeFile == null) {
            return ResponseResult.fail(ResponseEnum.RESPONSE_CODE_MERGE_FILE_ERROR, null);
        }
        // 2. verify that the merged file's MD5 matches the one sent by the front end
        boolean checkResult = this.checkFileMd5(mergeFile, fileMd5);
        if (!checkResult) {
            return ResponseResult.fail(ResponseEnum.RESPONSE_CODE_VERIFY_FILE_ERROR, null);
        }

        // 3. delete all chunks of this file
        FileUtil.deleteDir(chunkFileFolderPath);
        // 4. fetch the chunk-upload record from redis
        String breakpoint = redisDao.get(RedisPrefixConst.BREAKPOINT_PREFIX + fileMd5);
        if (StringUtils.isEmpty(breakpoint)) {
            return ResponseResult.fail("File chunk record not found");
        }
        ResourceBO resourceBO = JSONObject.parseObject(breakpoint, ResourceBO.class);
        // 5. remove the chunk record from redis
        redisDao.del(RedisPrefixConst.BREAKPOINT_PREFIX + fileMd5);

        // 6. assemble the response
        ret.put("filePath", filePath);
        ret.put("resId", String.valueOf(resourceBO.getResourceId()));
        return ResponseResult.success(ret);
    }

    /**
     * Merge the chunk files into the final file
     *
     * @param chunkFileList
     * @param mergeFile
     * @return java.io.File
     * @author zxzhang
     * @date 2020/1/11
     */
    private File mergeFile(List<File> chunkFileList, File mergeFile) {
        try {
            // delete any existing target file, otherwise create a new empty one
            if (mergeFile.exists()) {
                mergeFile.delete();
            } else {
                mergeFile.createNewFile();
            }
            // sort the chunks by their numeric file name (the chunk index)
            Collections.sort(chunkFileList, (o1, o2) -> {
                if (Integer.parseInt(o1.getName()) > Integer.parseInt(o2.getName())) {
                    return 1;
                }
                return -1;
            });

            byte[] b = new byte[1024];
            RandomAccessFile writeFile = new RandomAccessFile(mergeFile, "rw");
            for (File chunkFile : chunkFileList) {
                RandomAccessFile readFile = new RandomAccessFile(chunkFile, "r");
                int len = -1;
                while ((len = readFile.read(b)) != -1) {
                    writeFile.write(b, 0, len);
                }
                readFile.close();
            }
            writeFile.close();
            return mergeFile;
        } catch (IOException e) {
            e.printStackTrace();
            return null;
        }
    }

    /**
     * Verify the merged file's MD5
     *
     * @param mergeFile
     * @param md5
     * @return boolean
     * @author zxzhang
     * @date 2020/1/11
     */
    private boolean checkFileMd5(File mergeFile, String md5) {
        // compute the MD5 of the merged file and compare it with the MD5 sent by the front end
        try (FileInputStream inputStream = new FileInputStream(mergeFile)) {
            String md5Hex = DigestUtils.md5Hex(inputStream);

            if (StringUtils.equalsIgnoreCase(md5, md5Hex)) {
                return true;
            }

        } catch (Exception e) {
            e.printStackTrace();
        }
        return false;
    }

    /**
     * Get the file extension
     *
     * @param fileName
     * @return java.lang.String
     * @author zxzhang
     * @date 2019/12/10
     */
    public String getExt(String fileName) {
        return fileName.substring(fileName.lastIndexOf(".") + 1);
    }

    /**
     * Get the directory part of a file path
     *
     * @param filePath
     * @return java.lang.String
     * @author zxzhang
     * @date 2019/12/10
     */
    public String getFileDir(String filePath) {
        return filePath.substring(0, filePath.lastIndexOf(BootstrapConst.PATH_SEPARATOR));
    }

    /**
     * Get the file name (without extension)
     *
     * @param filePath
     * @return java.lang.String
     * @author zxzhang
     * @date 2019/12/10
     */
    public String getFileName(String filePath) {
        return filePath.substring(filePath.lastIndexOf(BootstrapConst.PATH_SEPARATOR) + 1, filePath.lastIndexOf("."));
    }
}
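
One design note on mergeFile: it copies each chunk through a 1 KB byte array, which works but is one of the slower ways to concatenate large files. If merge time matters, an NIO FileChannel transfer does the same job with far less copying. The following is a minimal alternative sketch (not part of the original service) that could stand in for the read/write loop above:

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.util.Comparator;
import java.util.List;

public final class ChunkMerger {

    /**
     * Concatenates the chunk files (named 0, 1, 2, ...) into mergeFile using FileChannel.transferFrom.
     * Same behavior as FileServiceImpl#mergeFile, without the 1 KB copy buffer.
     */
    public static File merge(List<File> chunkFiles, File mergeFile) throws IOException {
        chunkFiles.sort(Comparator.comparingInt(f -> Integer.parseInt(f.getName())));
        try (RandomAccessFile out = new RandomAccessFile(mergeFile, "rw");
             FileChannel outChannel = out.getChannel()) {
            long position = 0;
            for (File chunk : chunkFiles) {
                try (RandomAccessFile in = new RandomAccessFile(chunk, "r");
                     FileChannel inChannel = in.getChannel()) {
                    long size = inChannel.size();
                    long transferred = 0;
                    // transferFrom may move fewer bytes than requested, so loop until the whole chunk is copied
                    while (transferred < size) {
                        transferred += outChannel.transferFrom(inChannel, position + transferred, size - transferred);
                    }
                    position += size;
                }
            }
        }
        return mergeFile;
    }
}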

5. FileManagementHelper Code

package com.openailab.oascloud.file.common.helper;

import com.openailab.oascloud.file.common.config.BootstrapConfig;
import com.openailab.oascloud.file.common.consts.BootstrapConst;
import com.openailab.oascloud.file.common.enums.DocumentEnum;
import com.openailab.oascloud.file.common.enums.ImageEnum;
import com.openailab.oascloud.file.common.enums.ResourceTypeEnum;
import com.openailab.oascloud.file.common.enums.VedioEnum;
import com.openailab.oascloud.file.util.FileUtil;
import org.apache.commons.lang.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import java.io.*;
import java.text.SimpleDateFormat;
import java.util.Arrays;
import java.util.Date;
import java.util.LinkedList;

/**
 * @description:
 * @author: zhangzhixiang
 * @createDate: 2019/12/11
 * @version: 1.0
 */
@Component
public class FileManagementHelper {

    private static final Logger LOG = LoggerFactory.getLogger(FileManagementHelper.class);
    @Autowired
    private BootstrapConfig bootstrapConfig;

    /**
     * Determine the resource type from the file extension
     *
     * @param ext
     * @return java.lang.Integer
     * @author zxzhang
     * @date 2019/12/10
     */
    public Integer judgeDocumentType(String ext) {
        // video
        if (VedioEnum.containKey(ext) != null) {
            return ResourceTypeEnum.VIDEO.getCode();
        }
        // image
        if (ImageEnum.containKey(ext) != null) {
            return ResourceTypeEnum.IMAGE.getCode();
        }
        // document
        if (DocumentEnum.containKey(ext) != null) {
            return ResourceTypeEnum.FILE.getCode();
        }
        // unknown
        return ResourceTypeEnum.OTHER.getCode();
    }

    /**
     * Generate a random file name
     *
     * @param ext
     * @return java.lang.String
     * @author zxzhang
     * @date 2019/12/10
     */
    public static String createFileName(String ext) {
        SimpleDateFormat simpleDateFormat = new SimpleDateFormat("yyyyMMddHHmmss");
        return simpleDateFormat.format(new Date()) + (int) (Math.random() * 900 + 100) + ext;
    }

    /**
     * Get the file extension
     *
     * @param fileName
     * @return java.lang.String
     * @author zxzhang
     * @date 2019/12/10
     */
    public String getExt(String fileName) {
        return fileName.substring(fileName.lastIndexOf(".") + 1);
    }

    /**
     * Get the directory part of a file path
     *
     * @param filePath
     * @return java.lang.String
     * @author zxzhang
     * @date 2019/12/10
     */
    public String getFileDir(String filePath) {
        return filePath.substring(0, filePath.lastIndexOf(BootstrapConst.PATH_SEPARATOR));
    }

    /**
     * Get the file name (without extension)
     *
     * @param filePath
     * @return java.lang.String
     * @author zxzhang
     * @date 2019/12/10
     */
    public String getFileName(String filePath) {
        return filePath.substring(filePath.lastIndexOf(BootstrapConst.PATH_SEPARATOR) + 1, filePath.lastIndexOf("."));
    }
}

6. RedisDao Code

package com.openailab.oascloud.file.dao;

import com.openailab.oascloud.common.consts.ServiceNameConst;
import com.openailab.oascloud.common.model.ResponseResult;
import org.springframework.cloud.openfeign.FeignClient;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;

/**
 * @Classname: com.openailab.oascloud.datacenter.api.IRedisApi
 * @Description: Redis API
 * @Author: zxzhang
 * @Date: 2019/7/1
 */
@FeignClient(ServiceNameConst.OPENAILAB_DATA_CENTER_SERVICE)
public interface RedisDao {
    /**
     * @api {POST} /redis/set Put a plain cache entry with an expiration time
     * @apiGroup Redis
     * @apiVersion 0.1.0
     * @apiParam {String} key cache key
     * @apiParam {String} value cache value
     * @apiParam {long} expire expiration time
     */
    @PostMapping("/redis/set")
    ResponseResult set(@RequestParam("key") String key, @RequestParam("value") String value, @RequestParam("expire") long expire);

    /**
     * @api {POST} /redis/get Get a plain cache entry
     * @apiGroup Redis
     * @apiVersion 0.1.0
     * @apiParam {String} key cache key
     * @apiSuccess {String} value cache value
     */
    @PostMapping("/redis/get")
    String get(@RequestParam("key") String key);

    /**
     * @api {POST} /redis/del Delete a plain cache entry
     * @apiGroup Redis
     * @apiVersion 0.1.0
     * @apiParam {String} key cache key
     */
    @PostMapping("/redis/del")
    ResponseResult del(@RequestParam("key") String key);

    /**
     * @api {POST} /redis/hset Put a hash entry with an expiration time
     * @apiGroup Redis
     * @apiVersion 0.1.0
     * @apiParam {String} key cache key
     * @apiParam {String} item hash field
     * @apiParam {String} value cache value
     * @apiParam {long} expire expiration time
     */
    @PostMapping("/redis/hset")
    ResponseResult hset(@RequestParam("key") String key, @RequestParam("item") String item, @RequestParam("value") String value, @RequestParam("expire") long expire);

    /**
     * @api {POST} /redis/hget Get a hash entry
     * @apiGroup Redis
     * @apiVersion 0.1.0
     * @apiParam {String} key cache key
     * @apiParam {String} item hash field
     * @apiSuccess {String} value cache value
     * @apiSuccessExample {json} Success example
     * {"name":"张三","age":30}
     */
    @PostMapping("/redis/hget")
    Object hget(@RequestParam("key") String key, @RequestParam("item") String item);

    /**
     * @api {POST} /redis/hdel Delete a hash entry
     * @apiGroup Redis
     * @apiVersion 0.1.0
     * @apiParam {String} key cache key
     * @apiParam {String} item hash field
     */
    @PostMapping("/redis/hdel")
    ResponseResult hdel(@RequestParam("key") String key, @RequestParam("item") String item);
}
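
Note that RedisDao is not a local DAO but a Feign client: every Redis operation is proxied through the data-center service registered in Eureka. For that to work, the application class must enable Feign clients (and service discovery). The bootstrap class is not shown in this article; a minimal sketch, assuming a standard Spring Cloud setup and a hypothetical class name, would be:

package com.openailab.oascloud.file;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.client.discovery.EnableDiscoveryClient;
import org.springframework.cloud.openfeign.EnableFeignClients;

/**
 * Hypothetical application bootstrap class (not included in the original post).
 * @EnableFeignClients is required so the RedisDao Feign client gets proxied,
 * and @EnableDiscoveryClient registers the service with Eureka as configured in application.properties.
 */
@SpringBootApplication
@EnableDiscoveryClient
@EnableFeignClients
public class FileManagementApplication {

    public static void main(String[] args) {
        SpringApplication.run(FileManagementApplication.class, args);
    }
}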

7. BootstrapConfig Code

package com.openailab.oascloud.file.common.config;

import lombok.Data;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;

/**
 * @Classname: com.openailab.oascloud.security.common.config.BootstrapConsts
 * @Description: Bootstrap configuration class (externalized file path settings)
 * @Author: zxzhang
 * @Date: 2019/10/8
 */
@Data
@Configuration
public class BootstrapConfig {

    @Value("${file.client.type}")
    private String fileClientType;

    @Value("${file.root}")
    private String fileRoot;

    @Value("${file.biz.file.upload}")
    private String uploadDir;

    @Value("${file.biz.file.download}")
    private String downloadDir;

    @Value("${file.biz.file.backup}")
    private String backupDir;

    @Value("${file.biz.file.tmp}")
    private String tmpDir;

    @Value("${file.biz.file.breakpoint}")
    private String breakpointDir;
}

8. application.properties

eureka.instance.instance-id=${spring.application.name}:${server.port}
eureka.instance.prefer-ip-address=true
eureka.client.serviceUrl.defaultZone=http://127.0.0.1:32001/eureka/
server.port=32018
spring.application.name=openailab-file-management
#file
file.client.type = ceph
file.root = /usr/local/oas/file
file.biz.file.upload = /upload
file.biz.file.download = /download
file.biz.file.backup = /backup
file.biz.file.tmp = /tmp
file.biz.file.breakpoint = /breakpoint
#ribbon
ribbon.ReadTimeout=600000
ribbon.ConnectTimeout=600000
#base
info.description=File management service
info.version=@project.version@

spring.servlet.multipart.enabled=true
spring.servlet.multipart.max-file-size=5120MB
spring.servlet.multipart.max-request-size=5120MB

9. Table Structure

Column      Description                                                                                                            Type       Length
id          Primary key, sequence(course_resource_id_seq)                                                                          int        32
type        Resource type: 1 video; 2 document; 3 image                                                                            int        2
fileName    File name                                                                                                              varchar    100
fileSize    File size                                                                                                              int        64
filePath    File path                                                                                                              varchar    200
status      0: no transcoding needed; 1: transcoding; 2: transcoded; 3: upload not finished; 4: upload finished; -1: transcoding failed   int   2
createDate  Creation time                                                                                                          timestamp  0
createUser  Creating user                                                                                                          varchar    50
isDelete    Delete flag: 0 not deleted; 1 deleted                                                                                  int        2
userId      User ID                                                                                                                int        32
fileMd5     Unique file identifier (the MD5 computed by WebUploader)                                                               varchar    100
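
This table maps to the ResourceBO entity that FileServiceImpl populates. That class is not listed in the article either; reconstructed from the columns above and the setters/getters used in the service code, it would look roughly like this sketch (field names inferred, not the original source):

package com.openailab.oascloud.common.model.tcm;

import java.util.Date;

/**
 * Hypothetical sketch of ResourceBO, reconstructed from the table structure above
 * and the setters/getters called in FileServiceImpl (the original class is not shown).
 */
public class ResourceBO {

    private Integer resourceId;  // id: primary key, sequence(course_resource_id_seq)
    private Integer type;        // resource type: 1 video; 2 document; 3 image
    private String fileName;
    private Long fileSize;
    private String filePath;
    private Integer status;      // transcoding/upload state, see the table above
    private Date createDate;
    private String createUser;
    private Integer isDelete;    // 0 = not deleted, 1 = deleted
    private Integer userId;
    private String fileMd5;      // unique file identifier (MD5 computed by WebUploader)

    // getters and setters omitted for brevity
}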

That concludes the walkthrough of resumable file upload. The complete front-end and back-end code can be downloaded here:

Front-end code download: resumable upload (WebUploader)

Back-end code download: resumable upload (Java)


Reposted from blog.csdn.net/qq_19734597/article/details/104059986