Parse pipe-delimited file with Spring Boot

NormX :

I have an Excel file that has pipe-delimited data in column A for a number of rows. The first row has the date, and after that comes the data. Each row has extra data, but I only want the first 3 fields in each row (account name, account number, account domain). The last row has the row count for the file. Below is an example of the file:

20200310|
Mn1223|01192|windows|extra|extra|extra||
Sd1223|02390|linux|extra|extra|extra||
2

I created a Spring Boot Java application and set up my database configuration. I need help creating a service that can parse the file, insert the data into a table, and compare the number of inserted rows against the row count in the file. I'm following this example for guidance, but my file isn't a CSV: https://howtodoinjava.com/spring-batch/csv-to-database-java-config-example/

Léo Schneider :

You can parse the file like this:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;

List<DataToInsert> parseData(String filePath) throws IOException {

    List<String> lines = Files.readAllLines(Paths.get(filePath));

    // remove the date header (first line) and the row-count trailer (last line)
    lines.remove(0);
    lines.remove(lines.size() - 1);

    // split each remaining line on '|' and keep only the first three fields
    return lines.stream()
            .map(s -> s.split("[|]"))
            .map(val -> new DataToInsert(val[0], val[1], val[2]))
            .collect(Collectors.toList());
}
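Since you also want to check the trailer row count, here is a minimal sketch of a variant that validates it before returning (the method name parseAndValidate is just illustrative, and it assumes the last line contains only the count):

List<DataToInsert> parseAndValidate(String filePath) throws IOException {
    List<String> lines = Files.readAllLines(Paths.get(filePath));

    // the last line holds the expected number of data rows
    int expectedCount = Integer.parseInt(lines.remove(lines.size() - 1).trim());
    lines.remove(0); // drop the date header

    List<DataToInsert> records = lines.stream()
            .map(s -> s.split("[|]"))
            .map(val -> new DataToInsert(val[0], val[1], val[2]))
            .collect(Collectors.toList());

    // fail fast if the parsed rows don't match the trailer count
    if (records.size() != expectedCount) {
        throw new IllegalStateException(
                "Row count mismatch: expected " + expectedCount + " but parsed " + records.size());
    }
    return records;
}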

And define your class like this:

class DataToInsert {
    private final String accountName;
    private final String accountNumber;
    private final String accountDomain;

    public DataToInsert(String accountName, String accountNumber, String accountDomain) {
        this.accountName = accountName;
        this.accountNumber = accountNumber;
        this.accountDomain = accountDomain;
    }

    public String getAccountName() {
        return accountName;
    }

    public String getAccountNumber() {
        return accountNumber;
    }

    public String getAccountDomain() {
        return accountDomain;
    }
}
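If you don't strictly need Spring Batch, a simple sketch of the insert side could use Spring's JdbcTemplate batch update; the service name, table name, and column names below are only placeholders, so adjust them to your schema:

import java.util.List;
import java.util.stream.Collectors;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;

@Service
public class AccountImportService {

    private final JdbcTemplate jdbcTemplate;

    public AccountImportService(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public void insertAll(List<DataToInsert> records) {
        // map each parsed record to the SQL parameters
        List<Object[]> args = records.stream()
                .map(r -> new Object[]{r.getAccountName(), r.getAccountNumber(), r.getAccountDomain()})
                .collect(Collectors.toList());

        // batch insert; table and column names are assumptions
        jdbcTemplate.batchUpdate(
                "INSERT INTO account (account_name, account_number, account_domain) VALUES (?, ?, ?)",
                args);
    }
}

You could then call parseData (or parseAndValidate) from this service and pass the result to insertAll.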
