Is running sql statement on larger datasets broken? #606

@agile-rails

Description

I lost a day on this problem. At first I thought the problem was Ruby 4.0, because I had just upgraded to it.

I got strange segfaults and errors that didn't make any sense when running a simple query like:

qry = 'SET TEXTSIZE 2147483647;select id, data from my_table'
@data = conn.execute(qry).each_with_object({}) { |data, obj| obj[data['id']] = data['data'] }

Then I found thread #595, which says:

"Our insert and do method, currently implemented on the Result class, perform the entire sequence. However, with execute, this is intentionally not done to allow lazy-loading of results from the server. This can lead to errors, some intended, others not "

OK. Maybe reading all the data at once is the problem. I changed the program to the version below, which is both ugly and much slower:

def read_a_block(qry, offset)
  # Raise TEXTSIZE so large text columns come back whole,
  # then page through the result with OFFSET/FETCH.
  my_sql = "SET TEXTSIZE 2147483647;#{qry} offset #{offset} rows fetch next 1000 rows only"
  conn.execute(my_sql).to_a
end

@data, offset = {}, 0
loop do
  data = read_a_block('select id, data from my_table', offset)
  break if data.empty?

  data.each { @data[it['id']] = it['data'] }
  offset += data.size
end

but it works.
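For what it's worth, if the workaround has to stay, the paging loop can at least be wrapped in a reusable helper. A sketch, with a hypothetical FakeConn standing in for the real TinyTds::Client so the logic runs without a server (fetch here is invented for the stub, not a real TinyTds method):

```ruby
# Hypothetical stand-in for TinyTds::Client. Returns at most
# `limit` rows starting at `offset`, like the OFFSET/FETCH query.
class FakeConn
  def initialize(rows)
    @rows = rows
  end

  def fetch(offset, limit)
    @rows[offset, limit] || []
  end
end

# Generalized version of the workaround loop: pull fixed-size
# blocks until an empty block signals the end of the result set.
def read_in_blocks(conn, block_size: 2)
  data, offset = {}, 0
  loop do
    rows = conn.fetch(offset, block_size)
    break if rows.empty?

    rows.each { |r| data[r['id']] = r['data'] }
    offset += rows.size
  end
  data
end

conn = FakeConn.new((1..5).map { |i| { 'id' => i, 'data' => "d#{i}" } })
result = read_in_blocks(conn)
```

The block size would of course be tuned upward (the loop above uses 1000 rows per block) and the fetch call replaced with the real OFFSET/FETCH query.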

I don't know what has been done, but please try to restore the old functionality.

by
TheR
