
Handling Large Data (3 Million) in UFT

Hello All, Need your thoughts/inputs

 

Can UFT handle the framework below?

"Import 3 million records (each row with 700 fields) and compare each against the database by connecting to SQL (via ADODB)"

 

Project details:

The application accepts a flat file with 3 million records (each record with 700 fields) and loads it into the DB based on very complex business rules for each type of data.

 

We have to validate every record in the flat file and verify that every field is loaded in the DB as per the business rules.

It's impossible to validate manually, so we want to check whether we can validate it using UFT.

 

If it were a few thousand records we have stored procedures written to validate them, but not for millions of records, and this is where we want to incorporate UFT.

 

 

To simplify: can I use UFT to import this large load and compare it against the data loaded in the DB?
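For what it's worth, here is a minimal sketch of what the record-by-record ADODB comparison could look like in plain VBScript. All names are hypothetical placeholders: the connection string, the file path, the `MemberTable`/`Field1` column names, and the pipe delimiter are assumptions, not your actual schema.

```vb
' Sketch: stream the flat file one record at a time and compare against the DB.
' Connection string, table/column names, and delimiter are placeholders.
Option Explicit
Dim conn, rs, fso, ts, line, fields, mismatches

Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDB;Integrated Security=SSPI;"

Set fso = CreateObject("Scripting.FileSystemObject")
Set ts = fso.OpenTextFile("C:\data\members.txt", 1) ' read, never load whole file

mismatches = 0
Do Until ts.AtEndOfStream
    line = ts.ReadLine
    fields = Split(line, "|") ' assumed pipe-delimited, 700 fields per record
    Set rs = conn.Execute("SELECT Field1 FROM MemberTable WHERE MemberId = '" & fields(0) & "'")
    If rs.EOF Then
        mismatches = mismatches + 1 ' record missing from DB
    ElseIf CStr(rs("Field1")) <> fields(1) Then
        mismatches = mismatches + 1 ' field value differs
    End If
    rs.Close
Loop
ts.Close
conn.Close
WScript.Echo mismatches & " mismatches found"
```

Bear in mind that one query per record means 3 million round trips, which will be very slow; a set-based comparison inside SQL will scale far better at this volume.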

 

 

Is there any other alternative way with UFT?

 

Thanks 

Rampraveen


Re: Handling Large Data (3 Million) in UFT

I believe that with UFT you would face performance problems with this amount of data, but you can try.

 

I suggest you utilize Microsoft Excel's built-in options to import the data; you can then write a simple macro to compare the records. This will be a very easy job too.

 

To import data from the DB into Excel:

1. Go to the Excel Data tab

2. From Other Sources >> Use Data Connection Wizard

3. Select DB type and Provide your details

4. It will import the data easily and you can see the progress. Recent versions of Excel also handle large amounts of data easily.

5. Now you can simply write a macro (VBA) with a For loop over rows and a For loop over columns to compare.

6. Ensure your comparison is based on the data type of each field.
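The macro in steps 5-6 might look roughly like this. It assumes (hypothetically) that the flat-file data sits on a sheet named "FlatFile" and the DB import on "DbImport", in the same row order with a header row:

```vb
Sub CompareSheets()
    Dim wsFile As Worksheet, wsDb As Worksheet
    Dim r As Long, c As Long, diffs As Long

    Set wsFile = Worksheets("FlatFile")
    Set wsDb = Worksheets("DbImport")
    diffs = 0

    ' Loop every data row, then every one of the 700 columns
    For r = 2 To wsFile.Cells(wsFile.Rows.Count, 1).End(xlUp).Row
        For c = 1 To 700
            If wsFile.Cells(r, c).Value <> wsDb.Cells(r, c).Value Then
                wsDb.Cells(r, c).Interior.Color = vbYellow ' flag the mismatch
                diffs = diffs + 1
            End If
        Next c
    Next r

    MsgBox diffs & " mismatched cells"
End Sub
```

One caveat worth noting: a single Excel worksheet tops out at 1,048,576 rows, so 3 million records would have to be split across multiple sheets or files.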


Re: Handling Large Data (3 Million) in UFT

The requirement sounds like ETL automation testing. UFT is good for UI test automation or some minimal data validation, but not for millions of records. You may want to write a wrapper using VB/VBA (connecting via ADODB/DAO) or Java (via JDBC).

 

Good luck

Shashy

 

 


Re: Handling Large Data (3 Million) in UFT

Import the test case data into another database and do a DB-to-DB comparison, or write the data to ASCII files and parse them at runtime for the comparison. Either method will be much faster than Excel, and both can be done in UFT. Excel will be god-awful slow...
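The DB-to-DB route can reduce the whole job to one set-based query: bulk-load the flat file into a staging table, then diff the two tables in a single statement. A sketch, assuming T-SQL (`EXCEPT` syntax) and hypothetical `StagingTable`/`TargetTable` names with identical column sets:

```vb
' Sketch: after bulk-loading the flat file into a staging table,
' one set-based query finds every row that differs. Names are placeholders.
Option Explicit
Dim conn, rs, diffs

Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDB;Integrated Security=SSPI;"

' Rows in staging with no exact match in target, and vice versa
Set rs = conn.Execute( _
    "SELECT * FROM StagingTable EXCEPT SELECT * FROM TargetTable " & _
    "UNION ALL " & _
    "SELECT * FROM TargetTable EXCEPT SELECT * FROM StagingTable")

diffs = 0
Do Until rs.EOF
    diffs = diffs + 1 ' log the differing row here if needed
    rs.MoveNext
Loop
rs.Close
conn.Close
WScript.Echo diffs & " differing rows"
```

This pushes the heavy lifting onto the database engine, which is exactly what it is built for.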


Re: Handling Large Data (3 Million) in UFT

Thanks for the reply shanm,

 

But the idea is not to bring Excel into it; given the size of the data that would be difficult, which is the reason we chose UFT and are exploring the possibilities in it.

 


Re: Handling Large Data (3 Million) in UFT


@Shashy wrote:

The requirement sounds like ETL automation testing. UFT is good for UI test automation or some minimal data validation, but not for millions of records. You may want to write a wrapper using VB/VBA (connecting via ADODB/DAO) or Java (via JDBC).

 

Good luck

Shashy

 

 


I agree with Shashy. You should look away from UFT for this purpose.

_____________________
Rajkumar Rajangam

Re: Handling Large Data (3 Million) in UFT

As there's no GUI involved, you don't really need UFT - your automator could easily do the same thing purely in VBScript - the code would be practically the same. It would run faster in pure VBScript, and you wouldn't have memory issues caused by the UFT client. I'd recommend a 64-bit machine to run it on, and build in some kind of start-point variable - if you got a million records in and then hit a bug, you wouldn't want to have to start from the beginning again!
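The start-point idea can be as simple as persisting the last processed row number to a small file and skipping past it on restart. A sketch with hypothetical file paths:

```vb
' Sketch: checkpoint the row counter so a crash doesn't force a full restart.
' File paths and the 10,000-row checkpoint interval are placeholders.
Option Explicit
Const CHECKPOINT = "C:\data\checkpoint.txt"
Dim fso, ts, cp, line, startRow, rowNum

Set fso = CreateObject("Scripting.FileSystemObject")

' Resume from the last checkpoint if one exists
startRow = 0
If fso.FileExists(CHECKPOINT) Then
    startRow = CLng(fso.OpenTextFile(CHECKPOINT, 1).ReadLine)
End If

Set ts = fso.OpenTextFile("C:\data\members.txt", 1)
rowNum = 0
Do Until ts.AtEndOfStream
    line = ts.ReadLine
    rowNum = rowNum + 1
    If rowNum > startRow Then
        ' ... validate this record against the DB here ...
        If rowNum Mod 10000 = 0 Then ' save progress every 10k rows
            Set cp = fso.CreateTextFile(CHECKPOINT, True)
            cp.WriteLine rowNum
            cp.Close
        End If
    End If
Loop
ts.Close
```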

 

VBScript would be fine for this kind of thing - I've done a lot of data migration testing scripts in a similar vein. The large number of records wouldn't be a big issue, as you shouldn't need to store the whole lot in memory at once. I'd take the approach that others have mentioned - import the data into properly formatted Excel. VB can handle Excel spreadsheets like a champ.

 

Getting VBScript to connect to the DB can be tricky at times, depending on which DB you're using, but there's plenty of help online about establishing connections.
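For reference, a few typical ADODB connection patterns - the server, database, DSN, and credential values below are all placeholders to adapt to your environment:

```vb
' Sketch: common ADODB connection strings (all names are placeholders).
Dim conn
Set conn = CreateObject("ADODB.Connection")

' SQL Server via the SQLOLEDB provider, Windows authentication
conn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDB;Integrated Security=SSPI;"

' Or Oracle via the Oracle OLE DB provider:
' conn.Open "Provider=OraOLEDB.Oracle;Data Source=MyTnsName;User Id=MyUser;Password=MyPwd;"

' Or a generic ODBC DSN:
' conn.Open "DSN=MyDsn;UID=MyUser;PWD=MyPwd;"
```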
