7+ years of extensive experience in Software Testing and Software Quality Assurance. Proficient in the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
Expertise in testing Distributed Web-Based Client-Server applications and XML.
Experience in analyzing requirements and in designing and executing manual and automated test cases and test conditions.
Highly proficient in different types of testing, including Functional, Regression, System, Integration, User Acceptance, Performance, Smoke & Sanity, and Exploratory Testing.
Experience in creating extensive SQL queries to extract data from databases for verification.
Experience with functional automation tools such as WinRunner and QuickTest Professional.
Experience with testing tools from Mercury, Rational, and Compuware.
Experience in performing GUI, interface, back-end, and batch-process testing of applications.
Excellent understanding of industry-standard testing methodologies, processes, and procedures.
Excellent skills in Designing validation/test plans and test strategies.
Experience in defect reporting and tracking using Mercury Quality Center and TestDirector.
Expertise in Test Planning, Test Cases Design, Test Environment Setup, Test Data Setup, Defect Management, Configuration Management, Test Metrics.
Excellent time-management and multitasking skills; able to meet deadlines and handle changing priorities.
Experience with Quick Test Professional (QTP) automated tool.
Experience working with the WinRunner automation tool and HP Quality Center.
Experience with Automated test methodologies and Automation Frameworks.
Experience in Web/Internet testing, and database testing.
Expertise using LoadRunner for Load, Stress, and Performance testing of web applications.
Education: MS Computer Engineering from Schiller International University, Dunedin, FL
Professional Experience:
Client: GE Consumer Finance, Shelton, CT Aug’08 – till date
Role: Sr. QA Analyst
Description:
The project was called Apollo Workstation. GECF expects Apollo Workstation to provide a single consolidated front-end interface for Customer Service Representatives (CSRs) and call-center Collectors, seamlessly integrating business processes across different domains while leveraging the existing back-end systems and databases. The Apollo Workstation model integrates existing business processes and IT solutions into a single-window operation across several areas of operation.
Involved in designing and documenting Test Plans, Test Cases, Test Scenarios, and Test Strategies based on business requirements and other specifications using Quality Center.
Involved in developing and executing formal test plans to ensure the delivery of quality software applications.
Prepared sample data (Test Data) for the historical data testing.
Performed Black Box and manual testing of the application to test the system for both the functional and user requirements for positive and negative scenarios.
Conducted walkthroughs and reviews of the documentation prepared for testing.
Inserted different checkpoints into the Quick Test Pro scripts to make the GUI and functional testing of the application more detailed and exhaustive.
Involved in identifying areas for improvements, implementing process improvements and ensuring long term compliance.
Automated test cases for Regression Testing using Quick Test Pro.
Reviewed system use cases and functional specifications with the appropriate business analysts.
Involved in writing SQL queries on data staging tables and data warehouse tables to validate the data results.
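Validation queries of this kind typically reconcile row counts and aggregates between a staging table and its warehouse counterpart. A minimal sketch of the idea, using SQLite in Python; the table and column names here are hypothetical stand-ins, not the actual project schema:

```python
import sqlite3

# Set up two hypothetical tables standing in for a staging table and a
# data warehouse table after a completed load.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

# Row counts and amount totals should match once the load is complete.
stg_count, stg_total = cur.execute(
    "SELECT COUNT(*), SUM(amount) FROM stg_orders").fetchone()
dw_count, dw_total = cur.execute(
    "SELECT COUNT(*), SUM(amount) FROM dw_orders").fetchone()

print(stg_count == dw_count and stg_total == dw_total)  # True when in sync
```

A mismatch in either figure flags the load for investigation before functional verification continues.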
Documented and reported the progress to the management on an ongoing basis and updated the requirement and defect status as per the current status of the testing project in the Quality Center.
Performed test planning, test case execution, and defect tracking and management using Quality Center.
Performed the Quality Center Administration by creating users and groups and controlling access permissions.
Worked on setting up the System Testing and User Acceptance Testing environments by provisioning access for testing. Coordinated with the Infrastructure Team to manage the test environments.
Involved in UAT of the entire application for all the subsystems.
Involved in Installation, Configuration and Administration of Quality Center.
Hosted the weekly Defect Review Calls and tracked the defects to closure.
Prepared the Weekly Status Reports to be submitted to upper management.
Generated reports and graphs from Quality Center.
Environment: Quick Test Pro (QTP), .NET, PL/SQL, SQL, Quality Center, MS Word, MS Excel, Oracle, SQL Server
Client: AT&T, Piscataway, NJ May’07-Jul’08
Role: Sr. QA Analyst
Description: AT&T is a communication services company providing local and long-distance voice and data services to customers all over the world. The AT&T wireless system is a web-based application that allows customers to register, create an account online, and perform business functions such as making payments, checking minutes, changing plans, and paying bills. Customer information is protected in encrypted form using Secure Sockets Layer (SSL) software.
Analyzed the Functional Design documentation, User Requirements and Technical Requirements.
Wrote test scripts using Quick Test Professional, analyzing the possible scenarios in Windows and web environments.
Generated QTP scripts with Standard, Text, Bitmap, and Table Checkpoints, as well as Synchronization Points.
Created and executed multiple reusable Actions within the scripts.
Implemented the automated testing effort using Quick Test Pro. Controlled the entire testing effort through the use of Quality Center.
Performed load testing using LoadRunner: built VuGen scripts and ran them via the Controller, defined threshold limits, and monitored server load. Parameterized scripts using Data Table parameters, Environment Variables, and Random Number parameters.
Retrieved output values and passed them as parameters to other Actions.
Performed Regression Testing using Quick Test Professional.
Involved in the verification of GUI, Bitmap, Text, Database checkpoints and synchronization points.
Performed GUI, Functionality, Security, Integration, and System testing of the application and investigated software bugs.
Ran automated scripts continuously and reported defects accordingly.
Executed the QTP scripts and wrote messages to the log using the Report feature.
Extensively used SQL to verify the integrity and consistency of the data in the database.
Scheduled Test Procedure Design Reviews in accordance with the procedures and regulations.
Prepared and executed test cases with different test sets for different objectives based on the business/functional requirements.
Used Quality Center for test planning, executing test cases and reporting defects.
Used all functionalities of Quality Center to link requirements to test cases, to define relationship between parent and child test cases, to track defects and generate different tests reports for senior management.
Conducted test planning and bug tracking using Quality Center.
Analyzed end-user performance results, determined which steps required performance optimization, installed diagnostics as needed, and worked with the Performance team.
Made Performance optimizations and capacity planning recommendations.
Worked closely with the QA manager to plan, schedule, and execute the QA strategy.
Environment: LoadRunner, Quick Test Professional (QTP), Quality Center, Windows NT, HTML, XML, WinRunner, J2EE, Apache, Oracle, WebLogic.
Client: Ameritrade Inc., NE Feb’06 – Apr’07
Role: Quality Assurance Tester
Description: Ameritrade Technology Group (ATG) is a pioneer in the discount and online brokerage industry. They implemented new web functionality such as Portfolio Management, e-mail confirmation, streaming quotes, real-time balances and positions, Mutual Funds, On Money integration, decimalization, AM extended-hours trading, Exchange Agreements, and wireless conversion.
Reviewed the Functional and System Requirement documents and the Design documents.
Designed Use Cases and developed the corresponding Test Cases.
Responsible for extensive testing of different modules of this Web-based application.
Involved in complete Testing Life Cycle for the various modules of this application.
Performed Black Box/Functional Testing, Regression Testing and Performance Testing.
Pre-testing phase involved understanding/analyzing project, vision, goals, specifications and requirements.
Developed test cases and test scripts collaboratively with other testers to test the functional requirements.
Initially performed Manual Testing of the application's functionality, then performed Regression Testing.
Attended Change Request meetings and made subsequent changes in test plan and wrote test cases according to change requests.
Parameterized the fixed values in checkpoint statements, created a data table for the parameters, and wrote functions to read new data from the table on each iteration.
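The data-driven pattern described here, replacing hard-coded checkpoint values with rows read from a data table on each iteration, can be sketched language-neutrally. In this Python illustration the data table contents and the function under test are hypothetical:

```python
import csv
import io

# A stand-in "data table": each row supplies inputs plus the expected
# checkpoint value for one iteration.
data_table = io.StringIO(
    "principal,rate,expected_interest\n"
    "1000,0.05,50\n"
    "2000,0.10,200\n"
)

def simple_interest(principal, rate):
    # Hypothetical application logic under test.
    return principal * rate

results = []
for row in csv.DictReader(data_table):
    actual = simple_interest(float(row["principal"]), float(row["rate"]))
    # Checkpoint: compare the actual output to the expected value
    # supplied by the data table for this iteration.
    results.append(actual == float(row["expected_interest"]))

print(all(results))  # True when every iteration's checkpoint passes
```

The same structure lets new test data be added by editing the table alone, with no change to the script logic.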
Responsible for keeping up test schedule and interacting with software engineers to ensure clear communication on Requirement and Defect Reports.
Analyzed, reported, and tracked defects using ClearQuest.
Involved in UAT and reported the issues to the Lead.
Provided testing results and weekly status reports to the QA lead, including a test summary report at the end of each week.
Client: Farmers Insurance, IL Oct’04 to Dec’05
Role: QA Analyst/Automation Engineer
Description: Farmers Insurance's online, web-enabled integrated system is accessed over the internet and intranet by Farmers underwriters and by producers for Farmers Insurance lines of business. It handles rating scenarios, reinsurance, etc. The different rating scenarios help compare strategic information such as insurance premiums, limits, and losses over different policy years. It interacts with applications such as FI Start to obtain Submission and related information, and interfaces with the PPS Common Chassis for Quote/Bind.
Involved in writing the Test Plan based on System Requirements Document.
Developed a functionality matrix to ensure completeness of testing. Wrote and executed test cases.
Generated automated test scripts using WinRunner for Regression Testing. Created user-defined functions to improve the maintainability of test scripts. Handled dynamically changing web objects using the Smart Identification feature.
Developed a test harness using WinRunner to help testers reuse commonly used actions and test scenarios, reducing redundancy in automation scripts.
Created reusable actions and used external library files to test specific test cases for Windows applications using WinRunner.
Created the Stress Test Plan, performed risk analysis, and communicated the results to team members.
Conducted benchmarking for Stress and Performance testing using Load Runner.
Performed Stress testing, identified database and application server issues using LoadRunner, and monitored the system.
Inserted Rendezvous Points within the Virtual User scripts in LoadRunner to emulate heavy user load on the server.
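A rendezvous point holds each virtual user until all of them have arrived, then releases them simultaneously to produce a load spike. As a rough analogy only (not LoadRunner itself), the same synchronization can be emulated in Python with `threading.Barrier`; the user count and bookkeeping below are illustrative:

```python
import threading

NUM_VUSERS = 5
rendezvous = threading.Barrier(NUM_VUSERS)  # releases when all 5 arrive
released = []
lock = threading.Lock()

def vuser(user_id):
    # ...per-user setup (login, navigation) would happen here...
    rendezvous.wait()             # block until every virtual user arrives
    with lock:
        released.append(user_id)  # all users fire their request together

threads = [threading.Thread(target=vuser, args=(i,)) for i in range(NUM_VUSERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(released))  # 5 — every user was released after the rendezvous
```

Staggered arrivals before the barrier have no effect on the spike: the concurrent burst begins only once the last user reaches the rendezvous.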
Extensively parameterized the VuGen scripts to simulate real-world load conditions.
Created a Risk Assessment document with the outstanding defect list and related risks.
Client: Airports Authority of India (AAI), India Jun’02 to Aug’04
Role: Test Engineer
Description: This is a web application for the Airports Authority of India to manage cargo arriving from different countries for Import and Export. The main modules include Imports, Exports, Disposal, and Common. The Import sub-module covers the general manifest, flight segregation, customs checking, lab tests, gate pass generation, cargo arrival, dispatching, and the billing system.
Designed the User Interface (UI) using HTML, ASP and VB script.
Wrote database constraints and rules to maintain data integrity.
Wrote triggers, stored procedures, and views, and created Data Flow Diagrams for the associated transactions.
Involved in Testing and Implementation.
Involved in Manual Testing of the interface and its functionality.
Executed test cases on each build of the application and verified the actual results against requirements.
Designed and developed various Master and Transaction Screens.
Environment: ASP, VB Script, HTML, Oracle, Test Director, Windows.