Spring Batch Questions

I have a requirement in my project to process data in fixed-length files. Each data file contains one header line and many detail lines. The header line contains summary information that consolidates the detail information, e.g. reporting period, employer report, total amount, etc. The detail lines contain information for each individual employee, e.g. employee contribution, contribution period, etc. There will be many data files from different employers to be processed by the job.

So I created a one-step job with the following readers, writers, and other custom classes:

  a. A MultiResourceItemReader to read all files in the folder.
  b. A FlatFileItemReader to read each file, configured as the delegate of the MultiResourceItemReader.
  c. linesToSkip set to 1 so the header line is skipped and handed to a LineCallbackHandler.
  d. In the LineCallbackHandler I parse the header line and convert it to a Report object.
  e. A DefaultLineMapper and BeanWrapperFieldSetMapper to parse each detail line and convert it to a MemberRecord object.

I need help achieving the following using Spring Batch.

  • I want one Report object to be available in the ItemWriter for each file processed in the folder, so I can add all the MemberRecord objects [detail lines] to the Report object and save it to the DB [I'm using Hibernate for the ORM]. I tried to do this by adding the Report object to the JobExecutionContext and accessing it in the ItemWriter. So I extended StepExecutionListenerSupport in the MemberRecordHeaderLineHandler class [which implements LineCallbackHandler] and overrode the beforeStep method. I can get the JobExecution object in the beforeStep method, and I store it in a local variable of the MemberRecordHeaderLineHandler class. But when control reaches the handleLine method, the JobExecution variable is null. I parse the header line and convert it to a Report object in the handleLine method. Since the JobExecution is null, I cannot add the Report object to the JobExecutionContext. I'm not sure how to pass the Report object to the ItemWriter. Please advise how to pass values from the LineCallbackHandler to the ItemWriter (see the sketch below for the data flow I'm after).
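
For reference, here is a minimal sketch of the data flow I am trying to achieve. It assumes the Report is stored in the step's ExecutionContext rather than the job's, and that the same handler instance is wired as both the skippedLinesCallback and the step listener; the class names are placeholders, not my actual code:

    import java.util.List;

    import org.springframework.batch.core.ExitStatus;
    import org.springframework.batch.core.StepExecution;
    import org.springframework.batch.core.StepExecutionListener;
    import org.springframework.batch.core.annotation.BeforeStep;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.batch.item.file.LineCallbackHandler;

    // Header callback: parses the header line and stores the per-file report
    // in the step's ExecutionContext. The SAME bean instance must be registered
    // as the skippedLinesCallback and as a step listener, otherwise beforeStep()
    // is invoked on a different object and the StepExecution reference stays null.
    public class HeaderLineHandlerSketch implements LineCallbackHandler, StepExecutionListener {

        private StepExecution stepExecution;

        @Override
        public void beforeStep(StepExecution stepExecution) {
            this.stepExecution = stepExecution;
        }

        @Override
        public void handleLine(String headerLine) {
            Object wcReport = parseHeader(headerLine); // tokenizer + wcReportService call goes here
            stepExecution.getExecutionContext().put("WCREPORT_OBJECT", wcReport);
        }

        @Override
        public ExitStatus afterStep(StepExecution stepExecution) {
            return null; // leave the step's exit status unchanged
        }

        private Object parseHeader(String headerLine) {
            return headerLine; // placeholder for the real WCReport parsing
        }
    }

    // Writer side: reads the report back from the same step ExecutionContext.
    class ReportAwareItemWriterSketch implements ItemWriter<Object> {

        private StepExecution stepExecution;

        @BeforeStep
        public void beforeStep(StepExecution stepExecution) {
            this.stepExecution = stepExecution;
        }

        @Override
        public void write(List<? extends Object> records) throws Exception {
            Object wcReport = stepExecution.getExecutionContext().get("WCREPORT_OBJECT");
            // add the detail records to the report here and persist it with Hibernate
            System.out.println("Report: " + wcReport + ", records in chunk: " + records.size());
        }
    }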

I also need suggestions on how to implement the following functionality using Spring Batch.

  • I currently have sample data in a folder under WEB-INF/conf/data. Ideally, I want to process all files from an FTP location. How do I specify an FTP folder location for the resources property?
  • After each file is processed successfully, I need to archive it to a different folder. How do I archive files using Spring Batch?
  • If there are any exceptions due to an incorrect data format, I need to update a record in the database and move the offending file to an "Error" folder. I do not want the job to stop because of this error; I want it to continue processing the other files. How should exceptions be handled in this case?

The job XML file:

    <bean id="erLoadFolderReader" class="org.springframework.batch.item.file.MultiResourceItemReader" scope="step">
        <property name="resources" value="#{jobParameters['FILE_NAME']}" />
        <property name="delegate" ref="erLoadFileReader" />
        <property name="saveState" value="false" />
    </bean>

    <bean id ="memberRecordHeaderLineHandler" class="com.htcinc.rs.batch.infrastructure.erLoadJob.MemberRecordHeaderLineHandler" />
    <bean id="erLoadFileReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
        <property name="saveState" value="false" />
        <property name="resource" value="#{jobParameters['FILE_NAME']}"  /> 
        <property name="linesToSkip" value="1" />
        <property name="skippedLinesCallback">
            <bean class="com.htcinc.rs.batch.infrastructure.erLoadJob.MemberRecordHeaderLineHandler">
                <property name="wcReportService" ref="wcReportService" />
                <property name="names" value="empNo,planCode,startDate,endDate,totalEmprContrb,totalEmplContrb,reportType" />
                <property name="headerTokenizer">
                    <bean class="org.springframework.batch.item.file.transform.FixedLengthTokenizer">
                        <property name="names" value="organizationCode,planCode,beginDate,endDate,totalEmployerContribution,totaEmployeeContribution,reportingType"></property>
                        <property name="columns" value="1-9,10-17,18-25,26-33,34-48,49-63,64-67" />
                    </bean>
                </property>
            </bean>
        </property>
        <property name="lineMapper">
            <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
                <property name="lineTokenizer">
                    <bean class="org.springframework.batch.item.file.transform.FixedLengthTokenizer">
                        <property name="names" value="ssn,firstName,lastName,middleName,birthDateText,genderCode,addressStartDateText,addrLine1,addrLine2,addrLine3,city,state,zip,zipPlus,wagesText,employerContributionText,employeeContributionText,recordType,startDateText,endDateText,serviceCreditDaysText,serviceCreditHoursText,jobClassCode,positionChangeDateText,hireDateText,terminationDateText,notes" />
                        <property name="columns" value="1-9,10-29,30-59,60-79,80-87,88-88,89-96,97-126,127-146,147-166,167-181,182-183,184-188,189-192,193-205,206-214,215-223,224-227,228-235,236-243,244-246,247-251,252-255,256-263,264-271,272-279,280-479" />
                    </bean>
                </property>
                <property name="fieldSetMapper">
                    <bean
                        class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
                        <property name="targetType"
                            value="com.htcinc.rs.domain.batch.MemberRecord" />
                    </bean>
                </property>
            </bean>
        </property>
    </bean>
    <bean id="memberRecordItemWriter" class="com.htcinc.rs.batch.infrastructure.erLoadJob.MemberRecordItemWriter" />
    <bean id="memberRecordItemProcessor" class="com.htcinc.rs.batch.infrastructure.erLoadJob.MemberRecordItemProcessor" />

    <batch:job id="erLoadJob">
        <batch:step id="erLoadJob_step1">
            <batch:tasklet>
                <batch:chunk reader="erLoadFolderReader" writer="memberRecordItemWriter" processor="memberRecordItemProcessor" commit-interval="1" />
            </batch:tasklet>
            <batch:listeners>
                <batch:listener ref="memberRecordHeaderLineHandler"/>
            </batch:listeners>          
        </batch:step>
    </batch:job>
</beans>


MemberRecordHeaderLineHandler.java file:

    private WCReportServiceDefaultImpl wcReportService;
    private WCReport wcReport;
    private JobExecution jobExecution;
    private LineTokenizer headerTokenizer;
    private String names;

    public WCReportServiceDefaultImpl getWcReportService() {
        return wcReportService;
    }

    public void setWcReportService(WCReportServiceDefaultImpl wcReportService) {
        this.wcReportService = wcReportService;
    }

    public LineTokenizer getHeaderTokenizer() {
        return headerTokenizer;
    }

    public void setHeaderTokenizer(LineTokenizer headerTokenizer) {
        this.headerTokenizer = headerTokenizer;
    }

    public String getNames() {
        return names;
    }

    public void setNames(String names) {
        this.names = names;
    }

    @Override
    public void handleLine(String headerLine) {
        FieldSet fs = getHeaderTokenizer().tokenize(headerLine);
        String datePattern = "MMddyyyy";
        Date defaultDate = Utility.getDefaultDate();
        try {
            wcReport = wcReportService.getWCReport(Integer.toString(fs.readInt("organizationCode")), fs.readString("planCode"),fs.readDate("beginDate", datePattern, defaultDate), fs.readDate("endDate", datePattern, defaultDate), fs.readString("reportingType"));
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        if(jobExecution != null && wcReport != null) {
            ExecutionContext jobContext = jobExecution.getExecutionContext();
            jobContext.put("WCREPORT_OBJECT", wcReport);
        }
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        this.jobExecution = stepExecution.getJobExecution();
    }


MemberRecordItemWriter.java file:

    private int iteration = 0;
    private JobExecution jobExecution;

    @Override
    public void write(List<? extends MemberRecord> records) throws Exception {
        System.out.println("Iteration-" + iteration++);
        Object wcReport = jobExecution.getExecutionContext().get("WCREPORT_OBJECT");
        for (MemberRecord mr : records) {
            //System.out.println(header);
            System.out.println(mr.getLastName());
        }       
    }

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        this.jobExecution = stepExecution.getJobExecution();
    }


Thanks, Vijay


1 answer


To process incoming files from FTP, you need to integrate Spring Integration with Spring Batch to create an event-based system:



  • Configure Spring Integration to listen to your FTP server.
  • Spring Integration will launch the job when it detects an incoming file.
  • Spring Batch will process the incoming file and save the data to the database.
  • Note: to add archiving functionality, put it in a Spring Batch JobExecutionListener or StepExecutionListener (a sketch follows below).
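
Here is a minimal sketch of that archiving note. It assumes the path of the file that was just processed has been put in the step's ExecutionContext under a key such as "CURRENT_FILE"; the key, the folder locations, and the class name are illustrative and not part of Spring Batch:

    import java.io.File;

    import org.springframework.batch.core.ExitStatus;
    import org.springframework.batch.core.StepExecution;
    import org.springframework.batch.core.StepExecutionListener;

    // Moves the processed file to an archive folder on success, or to an error folder otherwise.
    public class FileArchivingStepListener implements StepExecutionListener {

        // Hypothetical target folders; point these at your environment
        private File archiveDir = new File("data/archive");
        private File errorDir = new File("data/error");

        @Override
        public void beforeStep(StepExecution stepExecution) {
            // nothing to do before the step runs
        }

        @Override
        public ExitStatus afterStep(StepExecution stepExecution) {
            // Assumes the reader or header callback recorded the current file path here
            String path = stepExecution.getExecutionContext().getString("CURRENT_FILE", null);
            if (path != null) {
                File source = new File(path);
                boolean success = "COMPLETED".equals(stepExecution.getExitStatus().getExitCode());
                File targetDir = success ? archiveDir : errorDir;
                targetDir.mkdirs();
                if (!source.renameTo(new File(targetDir, source.getName()))) {
                    System.err.println("Could not move " + source + " to " + targetDir);
                }
            }
            return stepExecution.getExitStatus(); // do not change the step outcome
        }
    }

The listener would then be registered in the step's listeners element in the job XML, alongside the existing one.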