Kumar Jain (777) 7777-7777 xxxxx@gmail.com
SUMMARY



  • Over 8 years of experience in automated and manual testing of web and client/server applications in Windows NT/2000 and Windows XP environments.

  • Extensive experience in the creation and maintenance of test strategies and test plans.

  • Experience in both manual and automation testing of applications.

  • Experience in user requirement analysis, writing test plans, and executing test cases to verify application functionality against business requirements, both manually and using Mercury testing tools (WinRunner, QTP, LoadRunner, Quality Center, TestDirector) and Rational tools (RequisitePro, Test Manager, ClearQuest).

  • Extensive hands on experience in UAT, Data Testing and Performance Testing.

  • Experienced in the design and execution of test criteria, scenarios, and scripts from requirements.

  • Experience in black-box testing across the complete quality assurance cycle: testing, defect logging, and verification of fixed bugs.

  • Extensive working experience in Functionality Testing, Regression Testing, Integration Testing, Stress Testing and Load Testing.

  • Involved in customer service testing in many applications.

  • Experienced in all phases of the Software Testing Life Cycle (STLC) and good exposure to Software Development Life Cycle (SDLC).

  • Performed Database Verification & Validation testing using Oracle, MS SQL Server databases.

  • Interacted well with developers, managers, and team members.

TECHNICAL SKILLS

Languages: C, C++, Java, J2EE

Operating Systems: Windows XP

Web Technologies: HTML, CSS, VBScript

J2EE Technologies: Java Servlets 2.4, JSP 2.1, JDBC

Frameworks: Struts 1.1, Hibernate 3.0

Testing Tools: Quality Center, TestDirector, TFS, WinRunner 7.0, QuickTest Pro, TestComplete, LoadRunner 8.1, IBM Rational RequisitePro, Rational ClearQuest, Rational ClearCase

Web/App Servers: BEA WebLogic 8.1/9.1, Apache Tomcat 5

IDE: Eclipse 3.1, MyEclipse Enterprise Workbench 5.1/6.5

Database: Oracle 10g/9i/8i, SQL, PL/SQL


PROFESSIONAL EXPERIENCE

Client: Apple Inc, Cupertino, CA Oct 2008 to Current

Project: Claims project

Role: Senior QA Analyst
Description:

Apple Inc. is a multinational corporation that designs and markets consumer electronics, computer software, and personal computers. The project is called MobileMe. MobileMe is a subscription service that includes a suite of Internet applications that you can use to create, manage, or share your information wherever you are. MobileMe automatically pushes new email, contacts, and calendar events to your Mac or PC and over the air to your iPhone, iPad, and iPod touch. At me.com, you can check your email, manage your contacts and calendar, share photos, and store documents.


 

Responsibilities:
• Managed multiple tasks through the testing life cycle, including the test strategy, test plan, and test cases.

• Involved in system requirement reviews, technical reviews, and testing implementations.

• Created back-end test cases for the database and logs.

• Gained thorough knowledge of system apps such as iCal, Address Book, iPhoto, iWeb, and iMovie by testing .Mac and MobileMe publishing.

• Performed regression testing, functional testing, and GUI testing for Web Gallery and the MobileMe suite.

• Identified and tracked defects using Apple's defect-tracking system.

• Used Firebug and HttpFox to trace the root cause of issues.

• Wrote effective bug reports, attaching UNIX traces for hard-to-reproduce bugs.

• Performed browser compatibility testing on Mac OS X (10.4 and later) and PC (Windows XP & Vista).

• Tested recurring events and custom recurring events across different time zones.

• Tested marketing and error pages on Mac, PC, and iPhone in all supported languages and browsers.

• Tested storage allocation in the Accounts page of MobileMe

• Provided regular project status updates as requested.

• Participated in weekly status meetings to report issues. Communicated with developers through all phases of testing

• Performed iPhone Calendar Testing

• Involved with teams from other systems in setting up the test environment and releases

• Performed system, integration, interface, web, GUI, network, security, ad-hoc, user acceptance, and regression testing.

• Performed end-to-end testing covering both data transformation and functionality.

• Responsible for supporting all data and environment issues associated with test execution.

• Handled problem reporting and management, including timely and accurate communication to the impacted teams.

• Tested in an Agile environment with short sprints, where the emphasis was on hands-on testing and daily touchpoint meetings.

• Tested XML files (see the sketch after this section).

• Performed smoke and sanity testing on new builds.

• Performed database testing using TOAD and SQL.

• Tested within the Scrum framework.


• Performed regression testing during builds and releases.

• Performed browser interoperability testing across versions of Internet Explorer, Safari, Firefox, and Chrome.



• Tested common web issues such as caching, sessions, cookies, and login handling.
Environment: Mac OS X, Windows XP & Vista, IE, Safari, MS Office, ATS
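
A minimal sketch of the kind of XML check referenced above, written in VBScript against MSXML; the file path and messages are hypothetical illustrations, not artifacts from the project:

    ' Minimal sketch: well-formedness check for an XML test file using MSXML.
    ' The file path and the pass/fail messages are hypothetical examples.
    Option Explicit

    Dim xmlDoc
    Set xmlDoc = CreateObject("MSXML2.DOMDocument.6.0")
    xmlDoc.async = False
    xmlDoc.validateOnParse = True

    If xmlDoc.Load("C:\TestData\calendar_event.xml") Then    ' hypothetical path
        WScript.Echo "PASS: well-formed XML, root <" & xmlDoc.documentElement.nodeName & ">"
    Else
        WScript.Echo "FAIL: parse error, line " & xmlDoc.parseError.line & _
                     ": " & xmlDoc.parseError.reason
    End If

Run under Windows Script Host (cscript), a script like this gives a quick well-formedness pass before deeper content validation.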

Client: State Farm Insurance, Bloomington, IL Nov 2006 to Sept 2008

Project: Online Auto Insurance

Role: QA Analyst
Description: The project involved developing online services for auto insurance, such as getting a quote, buying insurance, making a payment, managing policies, and reporting a claim. Users can get a quote and buy insurance online; the application is based on the MVC architecture.
Responsibilities:

  • Reviewed project business and technical specifications.

  • Developed the test plans and test procedures used for manual and automated testing.

  • Conducted thorough reviews of test design documents prepared by the QA team.

  • Conducted business requirements analysis for multi-phase web-based and client/server projects to build and deploy service and operations workflows onto an integrated workstation.

  • Developed a requirements traceability matrix for each assigned project and conducted test-readiness sessions with business analysts to ensure all requirements were mapped.

  • Developed QA standards and methodologies to be followed for testing.

  • Developed all necessary documents, such as the test strategy, test plan, and audit report.

  • Wrote test cases in Quality Center for each assigned project based on the use cases, and worked independently with the business analysts and developers when a use case was unclear about a certain requirement.

  • Performed regression testing, integration testing, System Testing and User Acceptance Testing.

  • Performed Performance Testing of the application.

  • Performed end-to-end testing and customer service testing.

  • Wrote automated test scripts using QTP (see the data-driven sketch after this section).

  • Wrote SQL queries for checking data transactions and database integrity to perform data testing.
Environment: Windows 2003 Server, QTP, Java, J2EE, JavaScript, XML, XSD, SQL Server 2005.
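
A minimal sketch of the data-driven QTP step referenced above; the test-object names (AutoQuote, GetQuote, vin, premium) and the data-table columns (VIN, ExpectedPremium) are hypothetical assumptions, not the project's actual object repository:

    ' Minimal sketch of a data-driven QTP action step. Object names and
    ' data-table columns are hypothetical illustrations.
    Dim actualPremium

    ' Fill the quote form from the current data-table row
    Browser("AutoQuote").Page("GetQuote").WebEdit("vin").Set DataTable("VIN", dtGlobalSheet)
    Browser("AutoQuote").Page("GetQuote").WebButton("Calculate").Click

    ' Compare the displayed premium with the expected value for this row
    actualPremium = Browser("AutoQuote").Page("QuoteResult").WebElement("premium").GetROProperty("innertext")
    If Trim(actualPremium) = DataTable("ExpectedPremium", dtGlobalSheet) Then
        Reporter.ReportEvent micPass, "Premium check", "Matched: " & actualPremium
    Else
        Reporter.ReportEvent micFail, "Premium check", "Expected " & _
            DataTable("ExpectedPremium", dtGlobalSheet) & ", got " & actualPremium
    End If

With the action set to iterate over all rows of the global sheet, QTP repeats this check once per data row.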

Client: Vervelife Solutions, Chicago, IL Aug 2006 to Nov 2006

Project: Photo.com

Role: QA Analyst
Description: Vervelife is a digital media promotions company. This project was designed to develop digital media features for an online photo community. It involved an e-commerce shop (including photos, camera accessories, and more) where users could customize products as well as network within the online photo community. Users are also able to upload, enhance, and share photos.
Responsibilities:

  • Reviewed the test strategy and wrote the test plan.

  • Wrote test scenarios and test cases.

  • Developed VBScripts.

  • Performed functional and regression tests.

  • Reported test defects.

  • Analyzed the test results and generated reports in Mercury Quality Center.

  • Performed testing after defects were fixed to validate the functionality of the application.

  • Automated test cases using QTP.

  • Created the test plan, walkthrough, integration approach and strategy document, test cases, scenarios, conditions, and scripts, and generated system test scripts.

  • Developed automated test scripts with QTP using the data-driven technique.

  • Executed smoke tests after each build was deployed to the test region (see the sketch after this section).


Environment: VB.NET, Microsoft Visual Studio .NET 2003, Windows 2003 Server, XML, SQL Server 2005, Flash, QTP, Mercury Quality Center.
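
A minimal sketch of the smoke check referenced above; the test-region URL is a hypothetical placeholder:

    ' Minimal smoke-test sketch: after a build is deployed to the test region,
    ' verify the site responds before deeper testing begins.
    Option Explicit

    Dim http
    Set http = CreateObject("MSXML2.XMLHTTP.6.0")
    http.Open "GET", "http://test.photo.example.com/", False    ' hypothetical URL
    http.Send

    If http.Status = 200 Then
        WScript.Echo "PASS: application is up (HTTP 200)"
    Else
        WScript.Echo "FAIL: unexpected HTTP status " & http.Status
    End If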

Client: Flagstar Bank, Michigan June 2005 to July 2006

Project: Fund Management Application

Role: QA Analyst
Description: The Fund Management Application is used by Flagstar to manage information about investment products for the following lines of business: Corporate, Government, College Savings, and Offshore. Fund Management Analysts (FMAs) create new products or modify existing products for a specified line of business. The system allows the user to create funds and associate them with existing products.
Responsibilities:

  • Involved in review and analysis of requirements and design documentation.

  • Involved in preparing functional test plans for different modules of the application.

  • Analyzed use cases and specifications to identify the test cases.

  • Involved in designing and developing the test plan.

  • Analyzed the business and system requirement documents and responsible for developing detailed test cases.

  • Actively participated in creating requirements traceability matrices, and test plans.

  • Tested the functionality of the application with multiple data sets (positive as well as negative), multiple times, using the Data Driver wizard.

  • Performed database testing, regression testing, system testing, and user acceptance testing.

  • Designed scenarios to test the application functionality and ensure the data integrity.

  • Analyzed the test results and generated reports in Mercury TestDirector 8.0.

  • Performed testing after the defects were fixed to validate the functionality of the application.


Environment: VB.NET, Microsoft Visual Studio .NET 2003, UML 2.0, IIS 5.0, Windows 2003 Server, XML, XSD, Crystal Reports 8.5, .NET Framework, SQL Server 2005.

Client: Citibank, Chicago, IL Feb 2004 to April 2005

Project: Account Information System

Role: QA Analyst
Description: The Account Information System is a banking system where customers can perform operations on all their account types. By selecting an account, a customer can pay bills from that account or transfer funds from one account to another. Funds can also be transferred to another customer's account by registering that account with a one-time password (OTP) and adding it to the payees list.
Responsibilities:

  • Analyzed specifications and test plans for the testing process of the banking application.

  • Developed test cases after analyzing the specifications document.

  • Developed baseline scripts for testing future releases of the application using WinRunner.

  • Developed test scripts for performance and data driven tests using WinRunner.

  • Wrote test cases to manually test the performance of the application and incorporated them into the automated tool WinRunner.

  • Conducted functionality, performance, and regression testing during the various phases of the application using WinRunner.

  • Executed the test scripts using WinRunner and analyzed the results.

  • Developed and executed automated test case scripts using WinRunner.

  • This testing was done for the front end applications.

  • Participated in client/server & web testing using WinRunner.


Environment: WinRunner, TestDirector, Java, J2EE, JavaScript, Oracle, Windows 2000/XP, SQL, PL/SQL.

Client: Solomon Consultants LLC, Allen, Texas July 2002 to Jan 2004

Project: Flight Simulation software

Role: Software Test Engineer
Description: The Flight Simulation software consists of components such as the Real-Time Executive, which executes the flight mathematical models; the Instructor Station, which manages and views the training session; and the Output Panel (Head-Up Display), which displays the response of the simulated aircraft as deflections in the output flight instruments. Data communication between these components is through Windows sockets, helping the trainer-pilot visualize, manage, and analyze the flying training session.
Responsibilities:

  • Involved in preparing test cases based on the functional specifications / business requirements.

  • Performed Manual as well as Automated testing.

  • Conducted Data Base and Regression testing using QTP.

  • Conducted Back-end testing using Oracle9i and prepared reports by developing and executing SQL queries.

  • Delivered functional Design documents to Business Analysts.

  • Developed Positive & Negative scenarios for the requirements.

  • Developed the test plan, test cases, and test scripts in TestDirector after the requirements were signed off.

  • Reviewed and Analyzed business requirement documents, technical requirements and functional specification of various functionalities.

  • Created detailed test plan and test cases from the Business Requirements.

  • Created documents detailing the process for each of the test scripts and the scenarios each script performs.

  • Developed SQL scripts and tested Oracle stored procedures (see the sketch after this section).

  • Performed Regression testing for the various modules of the application.

  • Tested each application against the QA test plan and communicated findings to the team.

  • Involved in User Acceptance Testing with users along with providing training to end-users.


Environment: QTP, Windows 2003 Server, Microsoft Visual Studio, Java, J2EE, SQL Server 2005.
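
A minimal sketch of a back-end stored-procedure check through ADO, of the kind referenced above; the connection string, procedure name, and parameters are illustrative assumptions, not the project's actual schema:

    ' Minimal sketch: call a hypothetical Oracle stored procedure via ADO and
    ' report its output parameter. Connection details and names are assumptions.
    Option Explicit

    Dim conn, cmd
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=OraOLEDB.Oracle;Data Source=TESTDB;User Id=qa;Password=..."  ' hypothetical

    Set cmd = CreateObject("ADODB.Command")
    Set cmd.ActiveConnection = conn
    cmd.CommandType = 4                                   ' adCmdStoredProc
    cmd.CommandText = "GET_SESSION_COUNT"                 ' hypothetical procedure

    ' adInteger = 3; adParamInput = 1; adParamOutput = 2
    cmd.Parameters.Append cmd.CreateParameter("p_pilot_id", 3, 1, , 42)
    cmd.Parameters.Append cmd.CreateParameter("p_count", 3, 2)
    cmd.Execute

    WScript.Echo "Training sessions recorded: " & cmd.Parameters("p_count").Value
    conn.Close

Comparing the returned value against the result of an independent SQL query is a simple way to cross-check the procedure's logic.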


Education
Master of Science (Computer Engineering) from University of Bridgeport, CT
