One of the most common headaches when you integrate your IBM i with the outside world is converting data between EBCDIC and ASCII. You get a flat file from a PC, pull JSON from a web service, or process input from another platform and suddenly your strings look like garbage. The good news is that SQLRPGLE gives you clean, efficient ways to handle both directions without any fuss.
Here are the best approaches, starting with the tried and true QDCXLATE option.
IBM i API: QDCXLATE
Legacy method: QDCXLATE API (still works, but not recommended)
If you are on an older release or maintaining very old code you can fall back to the system program QDCXLATE. It uses translation tables that ship with the system.
The code looks something like this:
dcl-pr QDCXLATE extpgm('QDCXLATE');
Length packed(5:0) const;
Data char(32767) options(*varsize);
Table char(10) const;
end-pr;
dcl-s MyData char(50);
// ASCII to EBCDIC
QDCXLATE(50 : MyData : 'QTCPEBC');
// EBCDIC to ASCII
QDCXLATE(50 : MyData : 'QTCPASC');
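One practical wrinkle: if the text is shorter than the field, you usually only want to translate the bytes actually in use. A small sketch reusing the prototype above (the buffer name and its contents are made up):

dcl-s buffer char(100);
buffer = 'Hello from the ASCII side';
// Translate only the significant bytes, not the trailing blanks;
// %len(%trimr(...)) returns the length up to the last non-blank
QDCXLATE(%len(%trimr(buffer)) : buffer : 'QTCPEBC');

Because the length parameter is CONST, you can pass the expression directly and the compiler handles the conversion to packed(5:0).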
In this example RPG program, you can see we are reading an 80-character flat file in EBCDIC format and writing it back to an output file in ASCII:
**FREE
// Program: CONVERSION
// Description: Convert EBCDIC data to ASCII data (80-byte records)
// Author: Nick Litten
// Created: 2026-03-25
ctl-opt dftactgrp(*no) actgrp(*new) option(*nodebugio:*srcstmt);
// File Declarations
dcl-f filein disk(*ext) usage(*input) keyed;
dcl-f fileout disk(*ext) usage(*output);
dcl-c RECORD_LENGTH 80;
dcl-c TRANSLATION_TABLE 'QTCPASC';
// Global Variables
dcl-s recordData char(RECORD_LENGTH);
dcl-s outputData char(RECORD_LENGTH);
dcl-s endOfFile ind inz(*off);
dcl-s recordsProcessed packed(9:0) inz(0);
// Prototypes
dcl-pr Translate extpgm('QDCXLATE');
length packed(5:0) const;
data char(32767) options(*varsize);
table char(10) const;
end-pr;
// Main Processing Logic
// Read and process each record until end of file
read filein;
dow not %eof(filein);
convertAndWriteRecord();
recordsProcessed += 1;
read filein;
enddo;
// Normal termination
*inlr = *on;
return;
//------------------------------------------------------------------------------
// Convert single record from EBCDIC to ASCII and write to output
//------------------------------------------------------------------------------
dcl-proc convertAndWriteRecord;
// Output record as a data structure (*OUTPUT extract so it can be
// used as the result data structure on WRITE)
dcl-ds fileOutRec extname('FILEOUT' : *output) qualified end-ds;
// Clear work areas
recordData = *blanks;
outputData = *blanks;
// Get input record data; RECORD is the input field from FILEIN's
// record format, filled in by the READ in the mainline
recordData = record;
// Perform EBCDIC to ASCII translation
Translate(RECORD_LENGTH: recordData: TRANSLATION_TABLE);
// Prepare output record
outputData = recordData;
fileOutRec.out = outputData;
// Write converted record to output file; FILEOUTR is assumed to be
// the record format name of FILEOUT - adjust it to match your file
write FILEOUTR fileOutRec;
end-proc;
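If you want to try it yourself, the program compiles like any other free-format RPG source. A sketch, assuming the source sits in MYLIB/QRPGLESRC (the library, file, and member names are placeholders):

CRTBNDRPG PGM(MYLIB/CONVERSION) SRCFILE(MYLIB/QRPGLESRC) SRCMBR(CONVERSION)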
The tables QTCPEBC and QTCPASC live in QSYS. It works, but the modern CCSID method is cleaner, faster, and supports far more code pages without extra maintenance.
IBM i Native: The CCSID Keyword
The modern way (IBM i 7.2 and higher): Use the CCSID keyword
Since IBM i 7.2 you can declare character variables with a specific CCSID. The system handles the conversion automatically whenever you assign values between fields that have different CCSIDs. No APIs, no tables, just clean assignment.
Here is EBCDIC to ASCII:
dcl-s EbcdicText char(50);
dcl-s AsciiText char(50) ccsid(819);
// 819 = ISO 8859-1 Latin-1 ASCII
EbcdicText = 'Hello World from IBM i';
// automatic conversion
AsciiText = EbcdicText;
And the other way around (ASCII to EBCDIC):
dcl-s AsciiInput char(50) ccsid(819);
dcl-s EbcdicOutput char(50);
AsciiInput = someAsciiDataFromFileOrSocket; // whatever ASCII source you read
// converts back to native EBCDIC
EbcdicOutput = AsciiInput;
This works beautifully in SQLRPGLE programs. You can read ASCII data into a CCSID(819) host variable, process it in native EBCDIC, then write it back out the same way. Super performant and self-documenting.
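As a hedged sketch of what that looks like in an SQLRPGLE program (the WEBLOG table and its PAYLOAD and ID columns are made-up names):

dcl-s requestId packed(9:0) inz(1);
dcl-s asciiPayload char(200) ccsid(819);
dcl-s ebcdicPayload char(200);
// Fetch the ASCII column straight into a CCSID(819) host variable
exec sql select PAYLOAD
           into :asciiPayload
           from WEBLOG
          where ID = :requestId;
// One assignment converts it to native EBCDIC for processing
ebcdicPayload = asciiPayload;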
Here is the same program as before, but using the CCSID keyword rather than QDCXLATE:
**FREE
// Program: CONVERSION-CCSID
// Description: Convert EBCDIC data to ASCII data using CCSID (80-byte records)
// Author: Nick Litten
// Created: 2026-04-01
// Notes: Simplified version using CCSID instead of QDCXLATE API
ctl-opt dftactgrp(*no) actgrp(*new) option(*nodebugio:*srcstmt);
// File Declarations
dcl-f filein disk(*ext) usage(*input) keyed;
dcl-f fileout disk(*ext) usage(*output);
dcl-c RECORD_LENGTH 80;
dcl-c EBCDIC_CCSID 37; // EBCDIC US/Canada
dcl-c ASCII_CCSID 819; // ISO 8859-1 (Latin-1)
// Global Variables
dcl-s recordsProcessed packed(9:0) inz(0);
// Data structures with CCSID specifications
dcl-ds fileInRec extname('FILEIN') qualified end-ds;
// CCSID(*EXACT) makes the subfields take the actual CCSIDs of the
// FILEOUT fields; this assumes FILEOUT's field was created with
// CCSID 819, so the converted ASCII bytes are written through unchanged
dcl-ds fileOutRec extname('FILEOUT' : *output) ccsid(*exact) qualified end-ds;
dcl-s ebcdicData char(RECORD_LENGTH) ccsid(EBCDIC_CCSID);
dcl-s asciiData char(RECORD_LENGTH) ccsid(ASCII_CCSID);
// Main Processing Logic
// Read and process each record until end of file
read filein fileInRec;
dow not %eof(filein);
convertAndWriteRecord();
recordsProcessed += 1;
read filein fileInRec;
enddo;
// Normal termination
*inlr = *on;
return;
//------------------------------------------------------------------------------
// Convert single record from EBCDIC to ASCII and write to output
//------------------------------------------------------------------------------
dcl-proc convertAndWriteRecord;
// Get input record data (EBCDIC)
ebcdicData = fileInRec.record;
// Automatic conversion happens via the CCSID assignment
asciiData = ebcdicData;
// Prepare and write output record; FILEOUTR is assumed to be the
// record format name of FILEOUT - adjust it to match your file
fileOutRec.out = asciiData;
write FILEOUTR fileOutRec;
end-proc;
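The same keyword reaches well past Latin-1. If your data is UTF-8 (say, JSON from a web service), CCSID(*UTF8) on the variable does the job. A minimal sketch with made-up data:

dcl-s utf8Json char(500) ccsid(*utf8);
dcl-s workText char(500);
utf8Json = '{"customer":"Müller"}';
// Assignment converts UTF-8 to the job EBCDIC CCSID automatically
workText = utf8Json;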
Bottom line: if you are on IBM i 7.2 or newer, go straight to the CCSID keyword on your variables. It is the way IBM wants you to handle character conversion these days and it plays perfectly with embedded SQL. Your future self (and the next developer who touches the code) will thank you.
Which method are you using today? Have you run into any tricky cases with special characters or different CCSIDs?
