1
synikal
Doing very large DB updates
  • 2004/12/14 3:24

  • synikal

  • Just popping in

  • Posts: 25

  • Since: 2004/12/11


I have some very large data files I need to add to a database. The file contains geographical data and is about 72,000 lines long. If I try to run the update, the script of course times out. Does anyone have any strategies for updating the database?

2
brash
Re: Doing very large DB updates
  • 2004/12/14 3:35

  • brash

  • Friend of XOOPS

  • Posts: 2206

  • Since: 2003/4/10


Do you have admin access to your server?

3
JMorris
Re: Doing very large DB updates
  • 2004/12/14 5:44

  • JMorris

  • XOOPS is my life!

  • Posts: 2722

  • Since: 2004/4/11


Gee, this kinda sounds familiar.

synikal,

RVirtue and I ran into the exact same problem with the Netquery module.

Have you tried uploading the file via phpMyAdmin? If that still times out, you may want to break the file into smaller chunks and import them one at a time.
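
If it comes to splitting, a small helper script can do the job. The following is only a rough sketch (the file names and chunk size are placeholders, not anything from the Netquery module): it copies a big dump into smaller .sql files, breaking only on lines that end a statement, so each piece can be imported through phpMyAdmin on its own.

<?php
// split_dump.php - rough sketch only: split a large SQL dump into smaller
// files that can be imported one at a time through phpMyAdmin.
// File names and the chunk size are placeholders; adjust for your host.
$source   = 'geodata.sql';   // hypothetical name for the 72,000-line dump
$linesPer = 5000;            // approximate lines per output file

$in     = fopen($source, 'r');
$part   = 0;
$count  = 0;
$out    = null;
$closed = true;              // true when the previous line ended with ';'

while (($line = fgets($in)) !== false) {
    // start a new output file at the beginning, and whenever the current
    // chunk is full and the previous line finished an SQL statement
    if ($out === null || ($count >= $linesPer && $closed)) {
        if ($out !== null) {
            fclose($out);
        }
        $part++;
        $count = 0;
        $out   = fopen(sprintf('geodata_part_%02d.sql', $part), 'w');
    }
    fwrite($out, $line);
    $count++;
    $closed = (substr(rtrim($line), -1) === ';');
}

if ($out !== null) {
    fclose($out);
}
fclose($in);
echo "Wrote $part chunk files.\n";
?>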

4
christian
Re: Doing very large DB updates
  • 2004/12/14 8:53

  • christian

  • Just can't stay away

  • Posts: 401

  • Since: 2002/2/24


Do you know the Bigdump script?
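
For anyone who hasn't seen it: BigDump imports a dump in many small sessions (the page keeps reloading itself), so no single request runs long enough to hit the timeout. You upload the script and the dump file to the server and edit the connection settings at the top, which, from memory, look roughly like this (variable names may differ between versions, and the values here are placeholders):

// bigdump.php (excerpt) - settings edited at the top of the script;
// treat this as a sketch only, names may vary by BigDump version
$db_server   = 'localhost';
$db_name     = 'xoops_db';     // placeholder database name
$db_username = 'dbuser';       // placeholder credentials
$db_password = 'dbpass';
$filename    = 'geodata.sql';  // the dump file, uploaded next to the script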

5
carnuke
Re: Doing very large DB updates
  • 2004/12/14 9:07

  • carnuke

  • Home away from home

  • Posts: 1955

  • Since: 2003/11/5


There is a comprehensive FAQ HERE about this.

6
synikal
Re: Doing very large DB updates
  • 2004/12/14 15:24

  • synikal

  • Just popping in

  • Posts: 25

  • Since: 2004/12/11


I don't have admin access... Previously, I wrote a command-line script and asked the system admin to run it for me.

Thanks for the info, though.
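
For anyone else in the same position who can at least run PHP from a shell: a command-line script sidesteps the 30-second limit, because the PHP CLI has no execution time limit by default. A rough sketch of that kind of import (connection details and file name are placeholders, and the naive split on ';' assumes the data itself contains no semicolons):

<?php
// import_cli.php - rough sketch of a command-line import, run as:
//   php import_cli.php
// The CLI has no max_execution_time by default, so the web timeout
// does not apply. Connection details and file name are placeholders.
$mysqli = new mysqli('localhost', 'dbuser', 'dbpass', 'xoops_db');
if ($mysqli->connect_error) {
    die("Connect failed: " . $mysqli->connect_error . "\n");
}

$sql        = file_get_contents('geodata.sql');   // hypothetical dump file
$statements = array_map('trim', explode(';', $sql));

$done = 0;
foreach ($statements as $statement) {
    if ($statement === '') {
        continue;
    }
    if (!$mysqli->query($statement)) {
        die("Failed after $done statements: " . $mysqli->error . "\n");
    }
    $done++;
}

echo "Executed $done statements.\n";
$mysqli->close();
?>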

7
RVirtue
Re: Doing very large DB updates
  • 2004/12/14 15:47

  • RVirtue

  • Quite a regular

  • Posts: 246

  • Since: 2004/8/4


While the FAQ is good as far as it goes, I'm not sure that its backup/restore advice fully addresses this user's situation. And, unfortunately, when it comes to adding a large amount of new data to a database, the potential issues are manifold.

In addition to the default PHP timeout (30 seconds) mentioned briefly in the FAQ, I've also found (after much head banging) that some service providers impose their own "unadvertised" restrictions on SQL queries, even for so-called "business package" accounts. In fact, my experience suggests that the XOOPS module installer itself has a few quirks of its own when it comes to handling new SQL DB additions that are larger than "normal".

Basically, there are two "workaround" approaches. The first involves using a l-o-n-g series of single record inserts. The phpMyAdmin utility can be made to export data that way if one unchecks its "extended" option. Alternatively, one can split the large insert into smaller "chunks". (In my own case, approximately 2000 records each seemed to satisfy the limits imposed by my service provider, but others may vary, of course.) The latter approach looks something like this:
$query = ...
(0, 2018, 'tcp', 'terminaldb', '', 0),
(0, 2018, 'udp', 'rellpack', '', 0),
(0, 2019, 'tcp', 'whosockami', '', 0);";
$result =& $dbconn->Execute($query);
$query = "INSERT INTO ".$PortsTable." VALUES
(0, 3050, 'udp', 'gds_db', 'gds_db', 0),
(0, 3051, 'tcp', 'galaxy-server', 'Galaxy Server', 0),
(0, 3051, 'udp', 'galaxy-server', 'Galaxy Server', 0),
...
$result =& $dbconn->Execute($query);


Hope that helps.
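
If the data is coming out of a script rather than being pasted by hand, the same chunking can be automated. A sketch only, reusing the $dbconn and $PortsTable names from above and assuming the rows are already parsed into an array (the escaping here is deliberately simplistic):

// Build multi-row INSERTs of at most $chunkSize rows each, mirroring the
// ~2000-records-per-query limit described above. $dbconn, $PortsTable and
// $rows are assumed to exist already; each row is
// array(id, port, protocol, service, description, flag).
$chunkSize = 2000;
foreach (array_chunk($rows, $chunkSize) as $chunk) {
    $values = array();
    foreach ($chunk as $r) {
        $values[] = "(" . intval($r[0]) . ", " . intval($r[1]) . ", '"
                  . addslashes($r[2]) . "', '" . addslashes($r[3]) . "', '"
                  . addslashes($r[4]) . "', " . intval($r[5]) . ")";
    }
    $query  = "INSERT INTO " . $PortsTable . " VALUES " . implode(", ", $values) . ";";
    $result =& $dbconn->Execute($query);
}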
