Thread: DBISAM Transactional Lock
Tue, Oct 7 2014 5:12 AM

Danie van Eeden

Hi everyone,

I'm stuck with the following:

We need to post a bunch of data to a single transaction table.
We do this in an isolated database transaction, i.e. Start Transaction / Commit / Rollback.
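
For context, roughly the shape of our current approach (the procedure, component and field names below are placeholders, not our real code; it assumes the standard DBISAM VCL components):

procedure PostAllInOneTransaction(Database1: TDBISAMDatabase;
  tblPostings: TDBISAMTable; Count: Integer);
var
  I: Integer;
begin
  Database1.StartTransaction;       // one transaction around everything
  try
    for I := 1 to Count do          // can be hundreds of thousands of iterations
    begin
      tblPostings.Append;
      tblPostings.FieldByName('Amount').AsFloat := I;  // 'Amount' is a placeholder field
      tblPostings.Post;
    end;
    Database1.Commit;               // all-or-nothing
  except
    Database1.Rollback;             // undo everything on failure
    raise;
  end;
end;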

The problem is that the amount of data results in an "out of memory" error.
The usual first resolution is to split the postings into acceptable batch sizes, but in this case I cannot, due to data integrity issues.

What alternative solutions do I have?
Are there ways to configure the database engine to allow for a safe automated method? Are there other workarounds outside of the database configuration?

Your help appreciated.

Kind regards
Tue, Oct 7 2014 6:02 AM

Jose Eduardo Helminsky

HPro Informatica

Danie van Eeden

<<
We need to post a bunch of data to a single transaction table.
>>
How big is the data? How many records are involved?

One solution is exactly what you already mentioned: apply a Commit for every 5k records, for example.
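
A rough sketch of what that batched commit looks like in code (names here are placeholders; it assumes the standard DBISAM VCL database and table components):

procedure PostInBatches(Database1: TDBISAMDatabase;
  tblPostings: TDBISAMTable; Count: Integer);
const
  BatchSize = 5000;
var
  I: Integer;
begin
  Database1.StartTransaction;
  try
    for I := 1 to Count do
    begin
      tblPostings.Append;
      tblPostings.FieldByName('Amount').AsFloat := I;  // 'Amount' is a placeholder field
      tblPostings.Post;
      if (I mod BatchSize) = 0 then
      begin
        Database1.Commit;            // flush this batch and free its buffered changes
        Database1.StartTransaction;  // begin the next batch
      end;
    end;
    if Database1.InTransaction then
      Database1.Commit;              // commit the final partial batch
  except
    if Database1.InTransaction then
      Database1.Rollback;            // only the current batch is lost
    raise;
  end;
end;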

Another solution is to create a control field that indicates whether the record is valid or not. Then, after all the data has been inserted, you can change this field and the records will become valid for your process.

Eduardo (HPro)
Tue, Oct 7 2014 6:10 AM

Danie van Eeden

Hi Jose,
thanks for the reply.

Data is anywhere between 250 000 and 1 million records (please don't ask why :-) but it remains unavoidable).

I'm not sure I understand your second solution. Could you perhaps explain with a simple example?

Danie

Tue, Oct 7 2014 7:58 AM

Matthew Jones

Danie van Eeden wrote:

> I'm not sure I understand your second solution. Could you perhaps
> explain with a simple example?

I can't speak for Jose, but I expect it is similar to something I do in one application that needs a daily data update. I have a "status" field. A data refresh (import) starts by reading in the data and creating records with the status set to "new". This can take however long it needs, because the live processing queries only return results where the status is "live". So the import takes 5 minutes, and can then do some processing on the data, perhaps marking the records as "ready". Then, when the import is all complete, I start a transaction that sets all "live" records to "old" and all "ready" records to "live". That transaction either completes or it doesn't, independent of how much data there is. At the end you have either the new data or the old data, and nothing partial. With everything happy, I tend to leave it as-is, with "old" and "live" records sitting there for a day. So add a new step to the start of the import: delete all records that are not "live", to clear out old and partially imported data.

Works nicely for me. I hope I explained my variation sufficiently.
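
A rough sketch of that final swap step (the Items table, the Status field and the status values are illustrative guesses, not the actual schema; it assumes a TDBISAMQuery attached to the same database):

procedure SwapLiveData(Database1: TDBISAMDatabase; Query1: TDBISAMQuery);
begin
  Database1.StartTransaction;   // only the status swap runs inside the transaction
  try
    // retire the current live data
    Query1.SQL.Text := 'UPDATE Items SET Status = ''old'' WHERE Status = ''live''';
    Query1.ExecSQL;
    // promote the freshly imported, processed data
    Query1.SQL.Text := 'UPDATE Items SET Status = ''live'' WHERE Status = ''ready''';
    Query1.ExecSQL;
    Database1.Commit;           // both updates apply, or neither does
  except
    Database1.Rollback;
    raise;
  end;
end;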

--

Matthew Jones
Tue, Oct 7 2014 8:51 AM

Jose Eduardo Helminsky

HPro Informatica

Danie van Eeden

<<
I'm not sure I understand your second solution. Could you perhaps explain with a simple example?
>>

The idea has already been well described by Matthew Jones.

Use a status field, and for each insert put an asterisk (*) in this field. After the process is done, you clear this field. If the process breaks, just delete the records with *.
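
A minimal sketch of that cleanup step (the Postings table and Flag field names are placeholders; it assumes a TDBISAMQuery attached to the database):

procedure FinishImport(Query1: TDBISAMQuery; Succeeded: Boolean);
begin
  if Succeeded then
    // make the freshly inserted records valid for normal processing
    Query1.SQL.Text := 'UPDATE Postings SET Flag = '''' WHERE Flag = ''*'''
  else
    // the process broke: throw away the partial import
    Query1.SQL.Text := 'DELETE FROM Postings WHERE Flag = ''*''';
  Query1.ExecSQL;
end;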

Eduardo (HPro)
Tue, Oct 7 2014 10:36 AM

Danie van Eeden

Hi, thanks for the replies thus far.

I understand your method here. I have to weigh this up and compare it to our existing codebase to see if it's an option.

Danie