Performance Testing
1. Necessity
Common criteria for deciding whether performance testing is needed:
A regulator requires a performance test report
The system involves property or life safety
A large system goes into production for the first time
The core database or its hardware/software is upgraded
The number of users or the business volume grows by more than 30%
Weighing a single business within a single release:
Whether it is core to the platform
Whether deployment can still be adjusted or optimized
Whether changes with higher performance risk are introduced
Whether the customer requires certain business processes to be tested
Whether the release fixes multiple functional defects or makes major changes to a process
2. Performance Testing Requirements Analysis
Operational level
Functions used extensively by users
Business accounting for more than 80% of daily volume
Business accounting for 80% of volume on special trading days or at peak times
Core business processes that have undergone significant adjustment
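The 80% criteria above are often combined with the classic 80/20 rule of thumb (80% of a day's business arrives in 20% of the time) to estimate peak load. A minimal sketch; the rule and the daily volume are illustrative assumptions, not figures from these notes:

```python
# Rough peak-load estimate using the common 80/20 rule of thumb:
# 80% of a day's transactions are assumed to occur in 20% of the time.

def peak_tps(daily_transactions: int) -> float:
    """Estimated peak transactions per second."""
    busy_transactions = daily_transactions * 0.8   # 80% of the volume...
    busy_seconds = 24 * 3600 * 0.2                 # ...in 20% of the day
    return busy_transactions / busy_seconds

# Hypothetical daily volume of one million transactions:
print(peak_tps(1_000_000))
```

The resulting peak TPS (about 46 for a million daily transactions) is a starting point for load-test targets, not a substitute for measured traffic data.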
Project level
Businesses whose structure has changed since they were last performance-tested
Businesses with complex logic or of critical importance
Businesses that may consume large amounts of resources
Businesses with interface calls to external systems and heavy interaction
Businesses that call third-party components or have complex business logic
Performance Testing Requirements Review
Testability
A reasonably realistic test environment can be built
Consistency
Consistent with user needs, production requirements (realism), and operational requirements (planning for future growth)
Correctness
3. Performance Test Case Design
Test Modeling
Example: the login business process (mind map)
Open the home page
Enter the user name and password
Exit the system
Scenario case design
Classification
Single-business baseline test: verify that performance indicators meet the system design and user expectations
Single-business stress test: sustained service over a long period at maximum load
Single-business load test: find the maximum load the system can withstand
Mixed-business stress test
Mixed-business load test
Mixed-business stability test: core business running stably over a long period under standard load
Calculating the number of threads
Scenario use cases
Script case design
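The notes do not spell out how to calculate the number of threads; one common approach is Little's Law, where concurrency = throughput × (response time + think time). A minimal sketch with made-up target figures:

```python
import math

def thread_count(target_tps: float, avg_response_s: float,
                 think_time_s: float = 0.0) -> int:
    """Threads needed to sustain target_tps, by Little's Law:
    concurrency = throughput * (response time + think time).
    All figures here are illustrative, not from the notes."""
    return math.ceil(target_tps * (avg_response_s + think_time_s))

# e.g. 50 TPS with 1 s average response time and 2 s think time:
print(thread_count(50, 1.0, 2.0))  # -> 150
```

The result maps directly onto the "Number of Threads" field of a JMeter Thread Group; in practice a safety margin is added on top of the calculated value.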
4. Test Data Construction
Record a user-registration script with BadBoy
Export the recording as a .jmx script
Iterate in JMeter to generate accounts
Read the ${username} variable from a CSV file
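The CSV file consumed by ${username} can be generated with a short script. A sketch; the user0001/Pw_0001 naming scheme is a made-up convention:

```python
import csv

def write_user_csv(path: str, count: int) -> None:
    """Generate username/password rows for JMeter's CSV Data Set Config.
    The account naming scheme here is a hypothetical example."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for i in range(1, count + 1):
            writer.writerow([f"user{i:04d}", f"Pw_{i:04d}"])

write_user_csv("users.csv", 100)
```

In JMeter, point a CSV Data Set Config element at users.csv and set its variable names to username,password so each thread/iteration picks up a fresh row.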
5. Test Script Development
Record login and purchase scripts with BadBoy
JMeter configuration
Add -> Timer -> Constant Timer: set the interval
Add -> Assertions -> Response Assertion: check that login succeeded
Add -> Listener -> View Results Tree / Aggregate Report
Using Fiddler
If BadBoy fails to record the add-to-cart request, capture it with Fiddler and add it manually
Add -> Sampler -> HTTP Request
6. Scenario Design and Execution
Configure the number of concurrent threads and the scheduler
For scripts recorded with BadBoy, set the loop count on Step1 to Forever
Monitoring the results
Resource listener: jp@gc - PerfMon Metrics Collector
Download:
From https://jmeter-plugins.org/downloads/all/, download plugins-manager.jar
Place the file in the apache-jmeter/lib/ext directory
Install the plugin:
Select it in the Plugins Manager, then restart JMeter
Add the listener:
After restarting: Add -> Listener -> jp@gc - PerfMon Metrics Collector
Add metrics such as CPU and memory, then save
7. Test Case Execution
Environment
Watch the performance of the client machine
The server should ideally be dedicated to the test
Choose the timing carefully; run when the test/production environment has few users
Record the server configuration
Configuration of the servers under test:
Application server: model, count, CPU, memory, IP
Database server: model, count, CPU, memory, IP
Configuration of the test clients:
Client: model, count, CPU, memory, IP
Run the task
8. Result Analysis
Response time
Apdex
Business success rate (from assertions)
The test script sets an assertion that checks whether the text "login successful" appears after the user logs in, and adds an Assertion Results viewer; if all assertions pass, the business success rate is 100%
Concurrency
CPU and memory
Database
Result statistics
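The statistics above (response-time percentile, Apdex, assertion-based success rate) can be computed from the raw sample data. A sketch; the Apdex threshold T = 0.5 s is an assumed target, with satisfied meaning t ≤ T and tolerating meaning t ≤ 4T per the standard Apdex definition:

```python
def analyze(samples):
    """samples: list of (elapsed_seconds, success_flag) pairs,
    e.g. exported from a JMeter Aggregate Report.
    T = 0.5 s is an assumed Apdex target, not from the notes."""
    T = 0.5
    times = sorted(t for t, _ in samples)
    n = len(samples)
    satisfied = sum(1 for t, _ in samples if t <= T)
    tolerating = sum(1 for t, _ in samples if T < t <= 4 * T)
    return {
        "p90": times[int(0.9 * (n - 1))],            # simple nearest-rank P90
        "apdex": (satisfied + tolerating / 2) / n,   # (S + T/2) / total
        "success_rate": sum(ok for _, ok in samples) / n,
    }

data = [(0.3, 1), (0.4, 1), (0.6, 1), (1.2, 1), (2.5, 0)]
print(analyze(data))
```

With the sample data shown, two requests are satisfied and two tolerating, giving an Apdex of 0.6 and a success rate of 80%; real analysis would use the full JMeter result file.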
9. Performance Tuning
Typical symptoms of performance problems
Response time is stable but long
Response time grows gradually
Response time varies with the load
Accumulated data causes locking
Poor stability
Long response times, a system that keeps getting slower, and business errors are usually caused by:
Insufficient physical memory; memory leaks; resource contention; interaction with external systems; failed business operations restarted repeatedly with no terminal state; unreasonable middleware configuration or database connection settings; process/thread design errors