Lost rows with large(ish) TEXT item

I'm using Visual COBOL (for Visual Studio 2013, version 4.0) to access a MySQL Community 8.0 database: MySQL for Visual Studio 1.2.8, connecting through OpenESQL, in managed COBOL.

The core of my database is a three-level table hierarchy, joined on indexed keys; the bottom table holds variable-length line data defined as TEXT. My maximum text line is 7060 bytes. I have an application that processes lines of any length perfectly happily through this structure, generating converted output that clearly 'sees' and processes every byte of every row. However, a parallel process that dumps the raw data for comparison loses rows from the cursor wherever the TEXT length in the row exceeds 1024 bytes.

select (...)
,left(prop_line,1024)    **********
from program_node pnode
inner join element ele on ele.ele_id = pnode.ele_id
inner join property prop on prop.ele_id = ele.ele_id
inner join property_line pline on pline.ele_id = prop.ele_id and pline.prop_id = prop.prop_id                                                      
where program_name = :ws-program-name
order by hierarchy_id,prop.prop_id,prop_line_seq

***** Without this LEFT, or when the LEFT count is changed to 1025 or greater, rows of length > 1024 are omitted entirely, with no SQLCODE or any other indication of a problem.

All the joins and sequences are indexed.

The only workaround I have found is to split the query into parts that I knew worked elsewhere, i.e. fetch the first two/three tables in an outer cursor, then run a simple cursor on the lowest-level table(s) for each row from the outer one.
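
The split described above can be sketched as two nested loops. Here is a minimal Python mock, with in-memory lists standing in for the outer cursor (the top two/three tables) and the inner cursor (property_line rows fetched per key); all data and names are illustrative, not the actual schema:

```python
# Outer "cursor": one row per program_node/element/property join result.
outer_rows = [
    {"hierarchy_id": 1, "prop_id": 10},
    {"hierarchy_id": 1, "prop_id": 11},
]

# Bottom table, keyed by prop_id: property_line TEXT data (lengths vary).
lines_by_prop = {
    10: ["short line", "x" * 7060],   # second line exceeds 1024 bytes
    11: ["another line"],
}

def fetch_all():
    """One outer cursor; a simple inner cursor opened per outer row."""
    out = []
    for row in outer_rows:                              # outer cursor
        for seq, text in enumerate(lines_by_prop[row["prop_id"]]):
            out.append((row["hierarchy_id"], row["prop_id"], seq, text))
    return out

rows = fetch_all()   # all three lines survive, including the long one
```

The point of the split is that each inner cursor becomes a simple single-table fetch, which avoided whatever the multi-join cursor was tripping over.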

The behaviour suggests a size limit on some intermediate record, such as a sort work area, although removing the ORDER BY (which, according to EXPLAIN, eliminated a filesort) made no difference.

  • This seems like a very specific issue and we would probably require a test case to reproduce the problem in-house. I would suggest that you open up a new incident with Customer Support so that we may look into this.

    One question, though:

    What SQL compiler directives are you using and what method of database connection are you using, ODBC or ADO.NET?

  • Thanks.

    The connection is ODBC.

    This is one of those cases that is difficult to provide a consistent scenario for! I was hoping others might have encountered it. Although the issue is quite reproducible in the program where I originally discovered it, I have since found that the process I thought was working only works sometimes: the same cursor, acting on the same data, omits rows on one pass but includes them on another, whenever the TEXT length exceeds 1024.

    I have a subprocess that passes over the same data three times, the 1st and 3rd using one cursor and the 2nd using the same cursor definition in a different program. For a premature end of set, I have added a trap that runs SELECT COUNT(*) and MAX(LENGTH()). Whenever this trap fires, the count is always > 0, contradicting the SQLCODE 100 on the first FETCH, and the maximum length is always > 1024. For any given row set, it always occurs on the first of those three passes.

  • The only known problems and restrictions for MySQL can be found in the docs in the section here:

    Perhaps somebody else has run into something similar but I have not.

    If you can come up with a reproducible test case then we would be happy to take a look into it.

  • Further to this, the problem appears to lie within ODBC. Various users around the 'net report the same issue from other languages, e.g. Python. It occurs with both TEXT and VARCHAR(8000), and is not restricted to cursors: I added an 'overflow' record holding all text beyond 1024 bytes so that the scan could still use a cursor, but a singleton SELECT of that overflow record hit the same issue whenever the overflow text itself exceeded 1024 bytes. You get SQLCODE 100; paste that precise query into Workbench and the record is found.

    My unsatisfactory workaround was to chop the overflow itself into eight 1024-byte fields, which obviously precludes the use of LIKE, INSTR, etc., since a match may span a 1024-byte boundary.
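
The boundary problem with the chunked workaround can be shown in a few lines of Python (a sketch; the field size and all names are illustrative): a needle that straddles a 1024-byte boundary is invisible to any per-chunk search, so a LIKE or INSTR applied field by field cannot find it, while a search over the rejoined text can.

```python
CHUNK = 1024

def split_chunks(text, size=CHUNK):
    """Chop overflow text into fixed-size fields (the workaround)."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def naive_contains(chunks, needle):
    """Per-field LIKE/INSTR: misses matches spanning a chunk boundary."""
    return any(needle in c for c in chunks)

def spanning_contains(chunks, needle):
    """Correct search: rejoin (or overlap adjacent chunks) first."""
    return needle in "".join(chunks)

# Place the match so it straddles the byte-1024 boundary exactly.
text = "a" * 1020 + "NEEDLE" + "b" * 500
chunks = split_chunks(text)
# chunks[0] ends in "NEED", chunks[1] starts with "LE":
# naive_contains finds nothing, spanning_contains finds the match.
```

An alternative to rejoining everything is to also search each pair of adjacent chunks overlapped by len(needle) - 1 bytes, which bounds the work per comparison.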