
Starting with TestComplete and Automation?


Hi,

After posting a few articles on TestComplete, I still wonder whether I have explained how to start with TestComplete and automate an application from scratch. This article may be a little basic, or a little late, but I am publishing it before it is too late.

How to start exploring TestComplete? What TestComplete resources are available to help a beginner master it? How to automate an application? – These are a few of the questions I faced.

TestComplete is a proprietary test automation tool owned by SmartBear. A 30-day trial of the tool is available for beginners.

  • Once you have installed TestComplete on your machine (be sure to include ‘Sample Scripts’ during installation), navigate to the folder C:\Documents and Settings\All Users\Documents\TestComplete X Samples. Here you can find a set of sample projects for all the languages TestComplete supports.
  • The best way to start exploring is to understand the sample exercises/scripts. The scripts folder contains sample scripts for all the topics in TestComplete, and the other folders hold supporting applications for the test projects.
  • Use the support from SmartBear and refer to articles, screencasts and blog entries.
  • A best practice is to share your findings / workarounds / issues / doubts with the community, or within your team, as you learn; that way you can get them clarified and become clearer on the topics you have explored.

Someone might feel there is not enough time to go through all the resources. In that case, just start automating a small application; you will learn on the go. (But this is recommended only if you have prior experience in test automation!)

To start with, take a small scenario like editing and saving a file in Notepad, or automating the Windows Calculator. Go straight to recording your actions on the application, then play them back to verify the test runs again.

With this approach, if you dive into the recorded test scripts, you will understand how TestComplete translates your actions into a working scenario.

On this subject, I have written an entry in the TestComplete blog space, “Rapid Test Automation using TestComplete”, on ways to speed up our automation efforts along with a few framework-related points.

If you are already familiar with some test automation tools, make a comparison or case study to understand exactly what TestComplete will and will not do.

Once we are familiar with TestComplete as an automation tool, try to concentrate on any one of the scripting languages it supports. For your reference, it supports 5 scripting languages: JScript, VBScript, C# script, C++ script and Delphi script.

You may refer these links for your exploration,

Jscript – http://msdn.microsoft.com/en-us/library/hbxc2t98%28VS.85%29.aspx

VBScript – http://msdn.microsoft.com/en-us/library/t0aew7h6%28VS.85%29.aspx

Please feel free to comment or share how you started your own exploration…

Thanks & Regards,
Giri Prasad
Believe!!! There are 101 ways to automate…

Data handling support in TestComplete – Part 2


2. Stores:

In TestComplete, each project can have its own ‘Stores’ test item: a collection of various file types which can be accessed by the project scripts. It can be added to any project by simply right-clicking the <Project_Name> -> Add -> New Item -> Stores.

Collection of files, objects, and images organized as Stores.


The advantage of having such a collection is that it gives access to a complete repository of the required data in various formats. Mostly it is used for verifying expected results and as a data source across the AUT.

The Stores collection can accommodate data in the following formats:

  • DBTables – To store tables and queries.
  • Files – To store text or data files of any type.
  • Objects – To store object properties in XML format.
  • Regions – Images in formats like BMP, JPEG, PNG, TIFF or GIF.
  • Tables – To store data from AUT controls that display information in a tabular form.
  • WebTesting – Web page controls/items used to compare and verify web pages.
  • XML – To store XML documents.

When you record a scenario and need a verification point in the test, you will recognize/compare objects and properties. These items are in turn added to the Stores collection by the recorder while saving the scripts. Generally, when you use ‘Checkpoints’ in your script, the expected data/files are stored under the Stores collection.

We can also add items to the collection manually by right-clicking the Stores item, selecting the ‘New Item/Existing Item’ option and browsing for a suitable item/file.

a) Scope:

‘Global’ to its project. By default the Stores items are saved inside the project itself. We can also change the location to store them in a different path, but this creates a dependency when moving projects. Both script units and keyword tests have access to this collection.

b) Easy accessibility:

The Stores collection is accessible through an object called ‘Stores’ inside script extensions, and through the objects DBTables, Files, Objects, Regions, Tables, WebTesting and XML inside scripts. Through these objects we can call the corresponding methods, which help us greatly in manipulating our data during execution.

The objects Files, Objects, Regions and XML have general methods to add files, compare them, check for existence, get them by name/index and delete them. The remaining objects – DBTables, Tables and WebTesting – have methods based on the instances in them.
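To illustrate the general access pattern described above (add, check for existence, compare, delete), here is a minimal plain-JavaScript mock. This is an illustration only; the class name `FileStore` and its methods are hypothetical, standing in for the real built-in TestComplete objects such as Files:

```javascript
// Plain-JavaScript mock of a Stores-like collection (illustration only;
// the real TestComplete objects -- Files, Objects, Regions, XML -- are
// built into the tool and are not implemented like this).
class FileStore {
  constructor() {
    this.items = new Map(); // name -> stored (expected) content
  }
  add(name, content) {      // register an expected item under a name
    this.items.set(name, content);
  }
  contains(name) {          // check whether an item exists in the store
    return this.items.has(name);
  }
  compare(name, actual) {   // compare stored (expected) vs. actual content
    return this.items.get(name) === actual;
  }
  remove(name) {            // delete a stored item
    return this.items.delete(name);
  }
}

const store = new FileStore();
store.add("login_page", "<html>expected</html>");
console.log(store.contains("login_page"));                        // true
console.log(store.compare("login_page", "<html>expected</html>")); // true
console.log(store.compare("login_page", "<html>changed</html>"));  // false
```

The point of the sketch is the shape of the interface: a checkpoint is just a named, stored expected value that the test later compares against the actual result.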

These objects/files are independent of the scripting language, and TestComplete supports all the methods and properties to access them in any of its five supported scripting languages.

c) Adapting to changes:

The items/collections present in the project can be added, updated or deleted at any time with a few mouse clicks. TestComplete provides easy wizard-like forms to add/update the items and a well-designed UI to view the data in the files. For example, let us see how the data in an XML file is shown in TestComplete.

XMLViewer through which XML files can be viewed and edited


As the screenshot shows, it displays our files (expected results, checkpoints) in an easy-to-understand manner and provides options to update them. You can simply switch over to the ‘Source’ tab here to edit the XML file and save it to update it. Similarly, many of the ‘Stores’ files can be easily updated right inside TestComplete.

One more advantage of using Stores is that we can update the stored objects/files in one click. If the AUT has changed and the expected results/checkpoints need to be updated to the latest version of the product, we can simply ask TestComplete to update the object instead of checking/comparing it. So when the tests run, TestComplete will update its stored objects based on the mapping in the Stores. Use the Tools -> Options dialog to configure the compare/update setup as shown in the figure below.

Navigate to Tools-> Options-> Engines-> Stores dialog to update Stores collection


To be continued..

Hope there are 101 ways to Automate !!!
Thanks & Regards,
Giri Prasad

Data handling support in TestComplete – Part 1


Hi All,

In testing/automation we deal with different types of ‘data’: test data, expected data, configuration data, execution data, etc. How much we depend on test data changes with the framework chosen for automating the AUT; data-driven testing depends more heavily on the test data and expected data than other approaches do. Overall, we need a good mechanism to store, configure, and feed them into the scripts for effective automation.

So when I ask about data handling for automation, I might get answers from two categories.

  1. Using scripting language
  2. Using Automation tool

Since most scripting languages take a programmatic approach (initialize, use, destroy) to handling data, and that data is mostly not persistent across executions, we will move on and concentrate on the second option to get clarity on the support provided from the tool’s perspective.

Before handling it, we must understand that data handling plays a key role in determining the success of a test automation implementation. A complete analysis of our data handling requirements must be done before choosing an approach.

TestComplete supports DDT and provides various options to store data. Some of them are:

  1. ProjectSuite/Project variables
  2. Stores
  3. Using Data Storages
  4. Storages Object

I am going to consider the following criteria for each facility:

a. Scope
b. Easy accessibility
c. Adapting to change

1. Project Suite / Project variables:

The project suite and project items created using TestComplete were designed to store data, apart from project-related properties. To use this, we choose from a few available data types (String, Integer, Double, Boolean, Object, Table) and give the variable a meaningful name. From the scope perspective, these project suite/project variables can be classified as ‘Persistent’ and ‘Temporary’.

Simply put, persistent variables hold their value between test executions whereas temporary variables do not. Persistent variables also have an advantage in distributed automation, where we can have a separate value for each remote host through the ‘Local value’. Also, when your script accesses and updates both persistent and temporary variables, in the next execution the persistent variable will hold the updated value whereas the temporary one will hold its default value.

The properties of these variables are,

Name – Name of the variable (should follow the basic naming conventions).

Type – Any one data type chosen from [String, Integer, Double, Boolean]. Temporary variables can also be of type Object or Table.

Default value – The default value, if any, for the variable.

Local value – Available only for persistent variables; the local value to be used on a remote host during distributed execution.

Description – A summary explaining the variable’s usage.

Scope:

Global to the declaring item. As the names say, project suite variables can be accessed across the projects inside the suite, and project variables are accessible across the scripts inside the project.

Adapting to change:

Yes. We can modify these variables at any time during execution inside the scripts, and they are always available to change. Mostly they are used where we need the same set of information across script units, and where a value set by one script needs to be accessed by another.

Easy accessibility:

Syntax:

Project.Variables.<variable name>

ProjectSuite.Variables.<variable name>

As mentioned above, these can be accessed anywhere to ‘set’ or ‘get’ the values. TestComplete will give you code snippets for the defined variables.
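To make the persistent-vs.-temporary distinction concrete, here is a small plain-JavaScript simulation of two consecutive test runs. This is not the TestComplete API (there, the variables live under `Project.Variables` and persistence is managed by the tool); `persistentStorage` and `runExecution` are hypothetical names used only for the illustration:

```javascript
// Simulation of persistent vs. temporary project variables across runs.
// (Illustration only -- in TestComplete, persistence is handled by the tool.)
const persistentStorage = {};          // survives between "executions"

function runExecution() {
  // Temporary variable: reset to its default value on every run.
  let tempCounter = 0;
  // Persistent variable: picks up the value left by the previous run.
  let persistCounter = persistentStorage.counter ?? 0;

  tempCounter += 1;
  persistCounter += 1;

  persistentStorage.counter = persistCounter;  // written back for the next run
  return { tempCounter, persistCounter };
}

console.log(runExecution()); // { tempCounter: 1, persistCounter: 1 }
console.log(runExecution()); // { tempCounter: 1, persistCounter: 2 }
```

Note how `tempCounter` is 1 on every run while `persistCounter` keeps growing: exactly the behavior described above for temporary and persistent variables.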

Using Stores and data sources will be continued in next post…

Hope there are 101 ways to Automate !!!
Thanks & Regards,
Giri Prasad

Data-Driven Testing with TestComplete – Part 2


In this continued post we will discuss the data driven approach in detail with examples.

Test data access mechanism in TestComplete for DDT

This diagram explains how DDT is implemented in TestComplete. Looking at it, you can understand how the DDT object acts as an interface between the data source and our scripts.

The DDT object is implemented as an extension (using the COM mechanism) in TestComplete. In order to use it, we need to make sure it is installed in our TestComplete; it is installed by default. You can verify this in [File -> Install Extension].

The DDT object has separate drivers for data sources like Excel, CSV and ADO tables. The main advantage of having these drivers is that we can use the same code, i.e., access the properties and methods of the DDT object, independent of the data source. Based on the data source, we simply use the appropriate driver from the DDT object.

For example,

function TestDDTDriver()
{
  // Connect to an ADO data source through Windows authentication
  var objDB = DDT.ADODriver("*ADO table connection string*");
  ReadData(objDB);
  DDT.CloseDriver(objDB.Name);

  // Connect to a CSV data source
  objDB = DDT.CSVDriver("CSV File path");
  ReadData(objDB);
  DDT.CloseDriver(objDB.Name);

  // Connect to an Excel data source
  objDB = DDT.ExcelDriver("Excel File path", "Sheet-Name");
  ReadData(objDB);
  DDT.CloseDriver(objDB.Name);
}

function ReadData(objDB)
{
  // Iterate over every record exposed by the driver
  while (!objDB.EOF())
  {
    var strRecord = "";
    // Concatenate every column value of the current record
    for (var idx = 0; idx < objDB.ColumnCount; idx++)
    {
      strRecord += objDB.Value(idx) + " --- ";
    }
    Log.Message(strRecord);
    objDB.Next();  // move to the next record
  }
}

I am using a general “ReadData” function to read various kinds of data sources just by connecting through different drivers.

Thus the complexity of manipulating different data sources is abstracted away by the DDT object, which gives us a clean and easy way to access them.

The key point behind this idea is that we can imagine the data source as rows and columns, similar to DB tables, and ignore how Excel or CSV actually stores the data.

So how exactly does this object get the schema of each data source?

The assumptions are,

  • The object assumes that each row in your CSV/Excel file contains data for one test run. Files that use multiple rows for one run are not supported.
  • The column names are specified by the first line of the file.
  • To retrieve data from the file, the object uses the Microsoft Jet Engine.
  • While connecting to Excel files, we have the option to switch to ACE drivers (supported for Microsoft Office 2007 and later).

We can also override the default comma (“,”) delimiter in CSV files. The format of the CSV file is determined using a schema information file, named Schema.ini, located in the same folder as the CSV file. Schema.ini can keep information about several files; for each file it provides the general format, the field names and types, the character set used, the delimiter character, and a number of other data characteristics. For example:


; The contents of Schema.ini
[MyData.csv]
Format=Delimited(#)
CharacterSet=ANSI

Now the driver will read the CSV file “MyData.csv” with “#” as the delimiter.
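To see the effect of `Format=Delimited(#)`, here is a small plain-JavaScript sketch of what a delimiter-aware reader does with such a file. This is not the Jet Engine (the real parsing is done by Jet/ACE); the function name `parseDelimited` and the sample data are made up for the illustration:

```javascript
// Sketch: splitting records with a custom delimiter, mimicking what the
// Jet driver does after reading Format=Delimited(#) from Schema.ini.
// (Illustration only -- not the real Jet/ACE parsing.)
function parseDelimited(text, delimiter) {
  const lines = text.trim().split(/\r?\n/);
  const header = lines[0].split(delimiter);    // first line = column names
  return lines.slice(1).map(line => {
    const values = line.split(delimiter);
    const row = {};
    header.forEach((col, i) => { row[col] = values[i]; });
    return row;                                // one object per test run
  });
}

// Hypothetical contents of MyData.csv, using '#' as the delimiter:
const myData = "UserName#Password\nadmin#secret\nguest#guest123";
console.log(parseDelimited(myData, "#"));
// [ { UserName: 'admin', Password: 'secret' },
//   { UserName: 'guest', Password: 'guest123' } ]
```

Each parsed row corresponds to one test run, matching the assumption listed above that one row holds the data for one run.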

Another best practice I would like to mention: always close the opened connections to data sources (especially since the Jet Engine has a limit of 64 connections per process).
Hope there are 101 ways to automate…
Thanks & Regards,
Giri Prasad

Data-Driven Testing with TestComplete


Hi All,

To start with we should first understand what is DDT?

  • A scripting technique that stores test input and expected results in a table or spreadsheet, so that a single control script can execute all of the tests in the table. Data driven testing is often used to support the application of test execution tools such as capture/playback tools. www.infodiv.unimelb.edu.au/knowledgebase/itservices/a-z/d.html

Where DDT is more associated – Manual testing or Automation testing?

DDT is done where the functionality needs to be tested over a range of data. Here the flow won’t change; only the data changes for each execution. My understanding is that we can achieve better results with DDT if it is implemented in automation rather than in manual testing.

So when one thinks of implementing DDT through automation, he/she has to decide on many factors.

  1. How the data is to be maintained?
  2. How the scripts are to be organized?
  3. How the driver script is to be interfaced with data and actual test script?
  4. Is the implementation feasible/extendable?

When one has answers to the above questions, he has a DDT framework too.

The success of a DDT framework relies on organizing the test data and the scripts (driver script + test script), and implementing them with scalability and flexibility.

Let us see how TestComplete supports DDT.

TestComplete has good support for organizing test data, an in-built driver object (the DDT driver) to connect with the data store, and support from the test log to report execution results.

Data storage support :

TestComplete supports test data stored in the usual data stores such as Excel, CSV and ADO database tables. In addition, we can store our test data in TestComplete’s project suite variables or project variables. Most importantly, these project variables support data stored in a table-like structure (a 2-D array).

Script storage support :

Scripts can be organized inside TestComplete with

  • ProjectSuite – Holds one or more TestComplete projects.
  • Project – A collection of test items: scripts, stores, object repository and tested-application storage.
  • Folders – Logical grouping for scripts and files inside a project.

There is also an option to reuse the scripts across projects.

Driver support:

Once the scripts and data are placed in the corresponding locations, we need to interface them through data drivers. To simplify access to data stored in Excel sheets, ADO databases or CSV files, TestComplete includes special drivers (DDT drivers) that hide the actual data storage format and let you access data through a unified table-like interface.
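The unified table-like interface can be sketched in plain JavaScript. The in-memory `createDriver` below is hypothetical (the real drivers are returned by `DDT.CSVDriver`, `DDT.ExcelDriver` and `DDT.ADODriver`), but it exposes the same members the post's examples rely on: `EOF`, `Next`, `Value` and `ColumnCount`:

```javascript
// Sketch of the unified, table-like interface a DDT driver exposes.
// (Hypothetical in-memory version -- the real drivers come from the
// DDT object and read Excel/CSV/ADO sources.)
function createDriver(columns, rows) {
  let cursor = 0;                         // index of the current record
  return {
    ColumnCount: columns.length,
    ColumnName: idx => columns[idx],      // column name by index
    Value: idx => rows[cursor][idx],      // cell value in the current record
    EOF: () => cursor >= rows.length,     // past the last record?
    Next: () => { cursor += 1; }          // advance to the next record
  };
}

// A test script iterates the same way regardless of the data source:
const driver = createDriver(["UserName", "Password"],
                            [["admin", "secret"], ["guest", "guest123"]]);
const records = [];
while (!driver.EOF()) {
  const record = [];
  for (let idx = 0; idx < driver.ColumnCount; idx++) {
    record.push(driver.Value(idx));
  }
  records.push(record.join(" --- "));
  driver.Next();
}
console.log(records); // [ 'admin --- secret', 'guest --- guest123' ]
```

Because every driver presents this same cursor-over-rows shape, the test logic never needs to know whether the data came from Excel, CSV or a database table.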

In the next post we will discuss in detail what has to be done to achieve DDT through TestComplete.

Hope there are 101 ways to automate…
Thanks & Regards
Giri Prasad