Not able to process large data (more than 10k records) when processing with batch_size from Oracle #161

@aravinreizend

Description

Hi

Note: I have upgraded to 1.4 using #148

I have been facing the issue below for more than two weeks. I have tried many ways to track it down but have not been able to fix it.
Please help us resolve it. For small tables it works without a batch_size definition.

If I set batch_size to the total record count, it works for small tables. But with 2 million total records and batch_size set to 50000, it does not work: the first 50000 records are processed successfully, and the remaining records are marked as complete but are never copied to MongoDB. I tried changing the numbers in the oracle.config file below, but it does not work for large tables.
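
As far as I understand from the sample config, batch_size is enabled by uncommenting it inside the sql_connection block; this is a minimal sketch of what I tried (connection details omitted, 50000 is the value mentioned above):

sql_connection do
  adapter "oracle_enhanced"
  # ... host, port, username, password, database as in oracle.config below ...
  batch_size 50000 # default is 10000; records should be copied in chunks of this size
  encoding "utf8"
end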

Here is the .config file

File name: oracle.config

sql_connection do
  adapter "oracle_enhanced"
  host "xxx.xx.x.xxx"
  port "1525"
  username ""
  password "*"
  database "DEV"
  # batch_size 20000 # This is defaulted to 10000 but in case you want to make that smaller (on lower RAM machines)
  # Uncomment the following line if you get a "String not valid UTF-8" error.
  encoding "utf8"
end

mongodb_connection do
  host "xxx.xx.x.xxx"
  database "eastern_oracle"
  # Uncomment the following line if you get a "String not valid UTF-8" error.
  encoding "utf8"
end

File name: oracle_translation.rb

table "ra_customer_trx_all" do
column "customer_trx_id", :key, :as => :integer
column "trx_number", :string
column "trx_date", :string
column "exchange_rate", :string
end
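
For reference, the migration is being run with the standard Mongify process command from the README (assuming the two files above are in the working directory):

mongify process oracle.config oracle_translation.rb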

I have attached the error as a screenshot here:
[screenshot: issue]
