Custom Excel import

Hi everybody:

I have two tables, master/detail:
master: masterid, field1, field2, …
detail: detailid, masterid, fdetail1, fdetail2, …

I have a client web page from which I download Excel reports, and they export like this:
row1: master1, field1, field2, …, detail1, fdetail1, fdetail2, fdetail3, …, detail2, fdetail1, fdetail2, …
row2: master2, field1, field2, …, detail1, fdetail1, fdetail2, …, detail2, fdetail1, fdetail2, …


I mean, each row has the master fields and then a lot of columns, where the same columns repeat to add more details.
Well, I need to take the Excel file and:
1.- Check the masterid: if it exists, update the data; if not, add a new one.
2.- Iterate through the columns of the same row, and add or update the details.

The Excel file has two header rows.
I’ve been reading about Row_Import and Page_Importing, plus some threads about importing, and have done some tests, but I’m still not sure whether I can do all this in those events while importing the Excel file into the detail page, or whether I should write a custom script.
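For reference, this is the kind of logic I have in mind for Row_Import, as a rough, untested sketch only (the table and column names are from my schema above; Execute/ExecuteScalar/AdjustSql are PHPMaker helpers):

```php
// Sketch only: "master", "masterid" and "field1" are placeholder names from my schema
function Row_Import(&$row, $cnt) {
    // 1.- check the masterid: update if it exists, insert if not
    $exists = ExecuteScalar("SELECT COUNT(*) FROM master WHERE masterid = " . intval($row["masterid"]));
    if ($exists) {
        Execute("UPDATE master SET field1 = '" . AdjustSql($row["field1"]) . "' WHERE masterid = " . intval($row["masterid"]));
    } else {
        Execute("INSERT INTO master (masterid, field1) VALUES (" . intval($row["masterid"]) . ", '" . AdjustSql($row["field1"]) . "')");
    }
    // 2.- iterate the repeated detail column groups of the same row here,
    //     doing the same exists/update/insert for each detail record
    return FALSE; // skip the default import, since the row is handled manually
}
```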

Thanks a lot in advance

josejad wrote:

and making some test, but still not sure if I could make such things in these events

Please post your code for more discussion.

Thanks (as always) for your answer.
Something I missed before: I’m using v2019.

I’ve started by simply trying to see what I get, but I can’t see any result: sometimes I get “Server error 200: OK”, sometimes (changing the code) just “OK”.
I’ve used F12 → Network, but I can’t find anything; there’s no var_dump output to see what I’m doing.

// Page Importing event
function Page_Importing($reader, &$options) {
    $options["offset"] = 2;
    $options["headerRowNumber"] = 1;
    var_dump($reader); // Import data reader
    var_dump($options); // Show all options for importing
    //return FALSE; // Return FALSE to skip import
    die();
    return TRUE;
}

// Row Import event
function Row_Import(&$row, $cnt) {

ExecuteScalar(“INSERT INTO jobs(jobid, Name, Job_Type) VALUES (” . $row[“id”] .“,'” .$row[“Name”].“‘,’”.$row[“description”] . “'”);
echo $cnt; // Import record count
var_dump($row); // Import row
//return FALSE; // Return FALSE to skip import
die();
return FALSE;
}

josejad wrote:

ExecuteScalar("INSERT INTO jobs (jobid, Name, Job_Type) VALUES (" . $row["id"] . ",'" . $row["Name"] . "','" . $row["description"] . "')");

should be:
Execute("INSERT INTO jobs (jobid, Name, Job_Type) VALUES (" . $row["id"] . ",'" . $row["Name"] . "','" . $row["description"] . "')");

Yes, you’re right, I have to use Execute instead of ExecuteScalar.
I’ve started with an Excel file exported from my database, to test with just one record before doing it with the original file, so for now the headers in the Excel file are the same as in the database.
I can import into my detail table and insert the job in the master table.
Despite having uncommented the var_dump and echo, I still can’t see what’s happening.

Well, so I’ve started again with the client file.
Now, in Row_Import, I want to process the row data my own way.
The first step I want to try is finding out whether the masterID exists, to decide between update and insert.
The first column in the file is “id” (it’s the master id), so I want to try:
function Row_Import(&$row, $cnt) {
    $jobID = ExecuteScalar("SELECT jobid FROM jobs WHERE jobid = " . $row["ID"]);
    echo $cnt; // Import record count
    var_dump($row); // Import row
    return FALSE;
}
But I get an error message: “Invalid field name: id”. I’ve tried returning FALSE, but I get the same result.
How can I stop Row_Import from doing its normal job, and do what I want instead?
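For example, this is a sketch of how I could log the actual row keys to the audit trail, to see what field names the importer really gives me (untested):

```php
function Row_Import(&$row, $cnt) {
    // Write the real column keys to the audit trail, since var_dump shows nothing
    WriteAuditTrail("log", DbCurrentDateTime(), ScriptName(), CurrentUserID(),
        implode(", ", array_keys($row)), CurrentUserIP(), "", "", "", "");
    return FALSE; // skip importing while debugging
}
```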

I’ve tried to see $jobID in Page_Imported, but no result:
function Page_Imported($reader, $results) {
    $msg = "jobID" . $jobID; // Set up import message
    WriteAuditTrail("log", DbCurrentDateTime(), ScriptName(), CurrentUserID(), $msg, CurrentUserIP(), "", "", "", ""); // Write audit trail
    var_dump($reader); // Import data reader
    var_dump($results); // Import results
}
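Maybe the problem is that $jobID is local to Row_Import, so Page_Imported can’t see it. A sketch of what I could try next, sharing it through a global variable (untested):

```php
// Sketch: share a value between the import events via a global variable
function Row_Import(&$row, $cnt) {
    global $jobID;
    $jobID = ExecuteScalar("SELECT jobid FROM jobs WHERE jobid = " . $row["ID"]);
    return TRUE;
}

function Page_Imported($reader, $results) {
    global $jobID;
    $msg = "jobID: " . $jobID;
    WriteAuditTrail("log", DbCurrentDateTime(), ScriptName(), CurrentUserID(), $msg, CurrentUserIP(), "", "", "", "");
}
```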

I’ve found this post, which I hadn’t seen before, and it helps me understand some things:
http://www.hkvforums.com/viewtopic.php?f=4&t=45584&p=144391
I can use the first columns in the Excel file to insert or update the master table, and I’ve seen that I can build my own row.
My problem is that my file has a few columns with the master data and, after that, 1500 columns with a lot of details. The same columns repeat every 22 columns, so the column names are not unique.

row1: master1, field1, field2, …, detail1, fdetail1, fdetail2, fdetail3, …, detail2, fdetail1, fdetail2, …
So, how could I insert all the details from every single row?
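A sketch of the kind of loop I have in mind, assuming $row really keeps all the columns, and that the numbers (5 master columns, groups of 22) are only guesses about my layout:

```php
// Sketch only: $MASTER_COLS and $GROUP_SIZE are guesses about my file layout
function Row_Import(&$row, $cnt) {
    $MASTER_COLS = 5;  // number of leading master columns (to adjust)
    $GROUP_SIZE = 22;  // the detail block repeats every 22 columns

    $values = array_values($row); // work by position, since the names repeat
    // insert/update the master record from the first columns here...

    $groups = array_chunk(array_slice($values, $MASTER_COLS), $GROUP_SIZE);
    foreach ($groups as $detail) {
        if ($detail[0] === null || $detail[0] === "") {
            continue; // skip empty trailing groups
        }
        // insert/update one detail record from $detail[0..21] here...
    }
    return FALSE; // row handled manually
}
```

(If PHP collapses the duplicate header names when building $row, I’d need to read the cells by position from the reader instead.)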

Thanks

josejad wrote:

1500 colums with a lot of details

Are you sure you have 1500 columns?

mobhar wrote:

Are you sure you have 1500 columns

I’m afraid so. From A to BGH: 1538 columns.

I’m doing a workaround: a VBA Excel macro to split the file into a master import file and a detail import file.
I’m having some issues importing (an error with dates, solved with dateformat in the Row_Import event). After that, I get an error on every line (trying with six): (ok: 0, failed: 6), but no clue why.
I’m trying to see $results["failList"], with no success so far (I get “Array”, or “1” using print_r).
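I guess the “1” is just print_r’s return value, so next I’ll try passing TRUE as its second argument, which makes it return the nested array as a string instead of echoing it; a sketch of writing that to the audit trail:

```php
function Page_Imported($reader, $results) {
    // print_r(..., TRUE) returns the dump as a string instead of printing it
    $dump = print_r($results["failList"], TRUE);
    WriteAuditTrail("log", DbCurrentDateTime(), ScriptName(), CurrentUserID(), $dump, CurrentUserIP(), "", "", "", "");
}
```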

The next problem will be to update the imported data (the first fields in my db) and leave my own fields in the database (the last ones) as they are.

Thanks