TADOQuery memory release


Post time: 2020-2-12 14:00:01
Table a has 500,000 rows, all fetched in one go; after processing, the results are written into table b.

When execution starts, memory usage (of 512 MB) is around 30%. As more rows are processed, usage keeps climbing; almost all 512 MB is used up after roughly 80,000 rows have been processed. I don't know what is causing this. Is there any solution? Please enlighten me.
Post time: 2020-4-26 15:00:01
It seems you haven't done this kind of thing before. If you can do it with an UPDATE statement, so much the better; otherwise it is best not to fetch that much data at once.
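If the per-row transformation can be expressed in SQL, the whole job can stay on the server and no rows need to be fetched into the client at all. A minimal sketch of that idea; the table, column, and component names here are illustrative, not from the thread:

```pascal
// Sketch only: assumes the processing can be expressed in SQL and that
// ADOQuery1 is an already-connected TADOQuery. Table a, table b, the
// columns, and the "processed" flag are illustrative names.
ADOQuery1.SQL.Text :=
  'INSERT INTO b (id, col1) ' +
  'SELECT id, col1 FROM a WHERE processed = 0';
ADOQuery1.ExecSQL;   // runs server-side; no result set is returned

ADOQuery1.SQL.Text := 'UPDATE a SET processed = 1 WHERE processed = 0';
ADOQuery1.ExecSQL;
```

ExecSQL rather than Open is the key point: no client-side record cache is ever built, so memory stays flat regardless of row count.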
Post time: 2020-4-26 16:00:01
Attendance Record?
Post time: 2020-4-27 09:15:02
1. There may be a problem in the program itself that has nothing to do with the data volume;
2. If it is related to the data volume, find an ID column and process the data in batches.
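Batching on a key might look like the sketch below. It assumes SQL Server-style TOP syntax, an integer key column id, and a connected TADOQuery named qryBatch; all of those names are assumptions, and the batch size is arbitrary:

```pascal
var
  LastId: Integer;
begin
  LastId := 0;
  repeat
    qryBatch.Close;  // closing the query releases the previous batch's rows
    qryBatch.SQL.Text :=
      'SELECT TOP 10000 * FROM a WHERE id > :lastid ORDER BY id';
    qryBatch.Parameters.ParamByName('lastid').Value := LastId;
    qryBatch.Open;
    if qryBatch.IsEmpty then
      Break;                         // no rows left
    while not qryBatch.Eof do
    begin
      // ... process the record and write the result into table b ...
      LastId := qryBatch.FieldByName('id').AsInteger;
      qryBatch.Next;
    end;
  until False;
end;
```

Because each batch is closed before the next is opened, memory use is bounded by the batch size instead of growing with the total row count.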
Post time: 2020-4-28 11:30:01
If the program is well written, 500,000 records is not a lot, but you can still consider processing in batches.
Post time: 2020-5-1 19:45:01
Do it like paged browsing: split the data into several batches, choosing the batch size according to your memory. Process the batches one by one, and release the previous batch's data each time a batch finishes!
Post time: 2020-5-2 10:30:02
For large result sets, set the ADO cursor to CursorType = ctOpenForwardOnly.

This may reduce the memory footprint, but the data set can then only be traversed in one direction.
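In Delphi terms that might look like the sketch below (ADOQuery1 is an assumed, already-connected TADOQuery). Note that a forward-only cursor generally requires a server-side cursor location; with clUseClient, ADO upgrades the cursor to a static client-side copy of the whole result set, which is exactly what eats the memory:

```pascal
ADOQuery1.Close;
ADOQuery1.CursorLocation := clUseServer;       // don't cache the result set client-side
ADOQuery1.CursorType     := ctOpenForwardOnly; // rows can only be read front to back, once
ADOQuery1.LockType       := ltReadOnly;
ADOQuery1.SQL.Text := 'SELECT id, col1 FROM a WHERE processed = 0';
ADOQuery1.Open;
```

The trade-off is that dataset navigation (Prior, Locate, RecordCount) may no longer work as with a client cursor.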
 Author| Post time: 2020-5-7 12:00:02
The 500,000 rows are already one batch; the full data set commonly runs to millions of rows.

Table a has a field that marks whether a record has been processed; its initial value is 0. I SELECT the 500,000 rows, and each time I finish processing a record I set that field to 1 for it. That is when the problem in the title appears.
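One possible explanation, sketched below: if the flag is set by calling Edit/Post on the open 500,000-row dataset, the client-side recordset can keep the modified rows cached until it is closed, so memory grows with each record touched. Marking rows through a second, separate query avoids modifying the open dataset at all (qryRead, qryMark, and the column names are illustrative assumptions):

```pascal
// Sketch: qryRead holds the open SELECT; qryMark is a second TADOQuery
// on the same connection, used only for the flag update.
qryMark.SQL.Text := 'UPDATE a SET processed = 1 WHERE id = :id';
while not qryRead.Eof do
begin
  // ... process the record and write the result into table b ...
  qryMark.Parameters.ParamByName('id').Value :=
    qryRead.FieldByName('id').AsInteger;
  qryMark.ExecSQL;   // server-side update; nothing added to the client cache
  qryRead.Next;
end;
```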
Post time: 2020-7-9 19:30:01
Learning~!
Post time: 2020-7-13 16:45:01
It looks like this has to do with your code. Presumably you process the records one by one, and the records that have already been processed are not released until the whole batch is done, so of course memory keeps growing. You also load a lot of fields; it is recommended not to load fields you don't need. At the same time, batch further: based on your test, batches of about 40,000 should work better!

Contact us|Archive|Mobile|CopyRight © 2008-2023|verysource.com ( 京ICP备17048824号-1 )
