Some comma-delimited CSV files contain commas inside the data fields themselves (usually enclosed in double quotes), and a naive SPLIT ... AT ',' breaks such rows apart in the wrong places when loading the file into our system. I don't think there is a standard function module that loads such a file straight into a typed internal table, but the snippets below, built around the standard parser KCD_CSV_FILE_TO_INTERN_CONVERT, will save someone from re-inventing the wheel.
Here is a sample comma-delimited CSV text file. Copy the text below, paste it into Notepad, and save it as C:\Temp\Book1.csv:
10,Shafiq,"Sacramento, CA"
11,Willie,"New York, NY"
12,Conner,"Seattle, WA"
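The rows above rely on the standard CSV quoting convention: a field wrapped in double quotes may contain the separator character. As a quick sanity check outside ABAP (a minimal sketch in Python, not part of the ABAP solution), the standard csv module parses these same rows the way we expect, while a naive split on the comma does not:

```python
import csv
import io

# The same three sample rows as in Book1.csv
sample = ('10,Shafiq,"Sacramento, CA"\n'
          '11,Willie,"New York, NY"\n'
          '12,Conner,"Seattle, WA"\n')

# csv.reader honors the double-quote convention, so the comma
# inside "Sacramento, CA" stays within the third field
rows = list(csv.reader(io.StringIO(sample)))
print(rows[0])  # ['10', 'Shafiq', 'Sacramento, CA']

# A naive split on ',' wrongly produces four fields for the first row
print(sample.splitlines()[0].split(','))  # ['10', 'Shafiq', '"Sacramento', ' CA"']
```

This is exactly the behavior the ABAP parser below reproduces.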
Here is a sample program that reads this file into an internal table while preserving the commas in the city field:
Upload from Local PC:
TYPES: BEGIN OF kcde_intern_struc.
        INCLUDE STRUCTURE kcde_cells.
TYPES: END OF kcde_intern_struc.

DATA: l_intern TYPE TABLE OF kcde_intern_struc WITH HEADER LINE.

DATA: BEGIN OF it_output OCCURS 0,
        id(10)       TYPE c,
        name(20)     TYPE c,
        location(20) TYPE c,
      END OF it_output.

DATA: wa_output LIKE LINE OF it_output.
DATA: v_index   TYPE i.

FIELD-SYMBOLS: <fs>.

* The standard parser returns one entry per cell (row, column, value)
* and respects double-quoted fields, so embedded commas survive
CALL FUNCTION 'KCD_CSV_FILE_TO_INTERN_CONVERT'
  EXPORTING
    i_filename      = 'C:\Temp\Book1.csv'
    i_separator     = ','
  TABLES
    e_intern        = l_intern
  EXCEPTIONS
    upload_csv      = 1
    upload_filetype = 2.
IF sy-subrc <> 0.
  MESSAGE 'Error uploading CSV file' TYPE 'E'.
ENDIF.

* Pivot the cell list back into one record per row
LOOP AT l_intern.
  MOVE l_intern-col TO v_index.
* The column number selects the matching component of the work area
  ASSIGN COMPONENT v_index OF STRUCTURE wa_output TO <fs>.
  IF sy-subrc = 0.
    MOVE l_intern-value TO <fs>.
  ENDIF.
  AT END OF row.
    APPEND wa_output TO it_output.
    CLEAR wa_output.
  ENDAT.
ENDLOOP.
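The function module hands back a flat list of cells, each tagged with its row and column number, and the LOOP above pivots those cells back into one record per row. The same pivot can be sketched in a few lines of Python (the (row, col, value) tuples here are an assumption mirroring the layout of e_intern, with the column number mapped to a list index instead of ASSIGN COMPONENT):

```python
# Each parsed cell arrives as (row, col, value), like e_intern
cells = [
    (1, 1, '10'), (1, 2, 'Shafiq'), (1, 3, 'Sacramento, CA'),
    (2, 1, '11'), (2, 2, 'Willie'), (2, 3, 'New York, NY'),
]

# Collect cells into one record per row; col selects the field,
# the way ASSIGN COMPONENT v_index OF STRUCTURE does in ABAP
records = {}
for row, col, value in cells:
    records.setdefault(row, [None, None, None])[col - 1] = value

# Emit the records in row order, like APPEND at the end of each row
output = [records[r] for r in sorted(records)]
print(output[0])  # ['10', 'Shafiq', 'Sacramento, CA']
```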
Upload from Unix Server:
* Reuses the declarations (it_output, wa_output, v_index, <fs>) from the
* local PC version above
DATA: BEGIN OF itab_upload OCCURS 0,
        str(4096),
      END OF itab_upload.

DATA: l_intern TYPE TABLE OF kcde_intern_struc WITH HEADER LINE.

* Read the raw file line by line from the application server
OPEN DATASET p_path FOR INPUT IN TEXT MODE ENCODING UTF-8.
IF sy-subrc <> 0.
  MESSAGE 'Cannot open file on application server' TYPE 'E'.
ENDIF.
DO.
  READ DATASET p_path INTO itab_upload.
  IF sy-subrc NE 0.
    EXIT.
  ENDIF.
  APPEND itab_upload.
ENDDO.
CLOSE DATASET p_path.

* Call the same CSV parsing routine the function module uses internally
PERFORM separated_to_intern_convert IN PROGRAM saplkcde
  TABLES itab_upload
         l_intern
  USING  ','.

* Pivot the cell list back into one record per row, as above
LOOP AT l_intern.
  MOVE l_intern-col TO v_index.
  ASSIGN COMPONENT v_index OF STRUCTURE wa_output TO <fs>.
  IF sy-subrc = 0.
    MOVE l_intern-value TO <fs>.
  ENDIF.
  AT END OF row.
    APPEND wa_output TO it_output.
    CLEAR wa_output.
  ENDAT.
ENDLOOP.