
About OpenResty: developing for the WeChat Official Account platform with OpenResty

Project source code

https://github.com/helloJiu/o…

Installing OpenResty from source (using Ubuntu as an example)

apt install gcc libpcre3-dev libssl-dev perl make build-essential zlib1g-dev
wget https://openresty.org/download/openresty-1.19.9.1.tar.gz
tar -zxvf openresty-1.19.9.1.tar.gz
cd openresty-1.19.9.1/
./configure
make && make install

Installing LuaRocks

wget https://luarocks.github.io/luarocks/releases/luarocks-2.4.3.tar.gz
tar -xzvf luarocks-2.4.3.tar.gz
cd luarocks-2.4.3/
./configure --prefix=/usr/local/openresty/luajit     --with-lua=/usr/local/openresty/luajit/     --lua-suffix=jit     --with-lua-include=/usr/local/openresty/luajit/include/luajit-2.1
make && make install

Configuring environment variables

vim /etc/profile
# append the following line, then reload the profile:
export PATH=$PATH:/usr/local/openresty/bin:/usr/local/openresty/luajit/bin
source /etc/profile

# Symlink lua to the bundled LuaJIT
ln -s  /usr/local/openresty/luajit/bin/luajit lua
mv lua /usr/bin/

Installing Lapis

  • https://leafo.net/lapis/
/usr/local/openresty/luajit/bin/luarocks install lapis
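A Lapis application is a Lua module that returns an application object with routes attached. As a minimal sketch of what such a module looks like (the file name `app.lua` and the route are illustrative, not taken from the linked project):

```lua
-- app.lua: minimal Lapis application sketch
local lapis = require("lapis")
local app = lapis.Application()

-- A trivial route to confirm the framework is wired up.
app:get("/", function(self)
  return "hello from lapis"
end)

return app
```

Starting the server with `lapis server` (as done below) picks this module up and serves it through OpenResty.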

Installing the Redis and HTTP-client dependency packages, plus other dependencies

opm install lua-resty-string
opm install openresty/lua-resty-redis
opm install ledgetech/lua-resty-http
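The lua-resty-http package installed above is what lets the app call WeChat's HTTP API, e.g. to fetch an access_token via the client_credential grant. A sketch (the function name is ours; `app_id`/`app_secret` are placeholders that would come from config.lua; this only runs inside an OpenResty request context):

```lua
local http = require "resty.http"
local cjson = require "cjson.safe"  -- bundled with OpenResty

-- Fetch an access_token from the WeChat API (client_credential grant).
local function fetch_access_token(app_id, app_secret)
  local httpc = http.new()
  local url = "https://api.weixin.qq.com/cgi-bin/token"
      .. "?grant_type=client_credential"
      .. "&appid=" .. app_id
      .. "&secret=" .. app_secret
  local res, err = httpc:request_uri(url, { method = "GET" })
  if not res then
    return nil, err
  end
  -- On success WeChat returns { access_token = "...", expires_in = 7200 }
  local body = cjson.decode(res.body)
  if not body or not body.access_token then
    return nil, body and body.errmsg or "bad response"
  end
  return body.access_token
end
```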

WeChat Official Account platform preparation

Applying for a test account

https://mp.weixin.qq.com/debu…

Intranet tunneling (NAT traversal) tool

cpolar home page

cpolar.exe http 8123

Configuration

# Test account info
appID xxx
appsecret xxx
# Interface configuration: set the URL to
the address obtained from the tunnel, e.g. https://444aece.r6.cpolar.top/wechat/accept
# Verification Token, matching wechat.verifyToken in the config
helloworld
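During interface configuration, WeChat sends a GET request carrying signature, timestamp, nonce, and echostr; the server must sha1-hash the lexicographically sorted [token, timestamp, nonce] and, if the digest equals signature, echo echostr back. A sketch using the lua-resty-string package installed earlier (the function name and commented handler shape are illustrative, not the project's actual code):

```lua
local resty_sha1 = require "resty.sha1"
local str = require "resty.string"

-- WeChat server verification: sha1 over the sorted token/timestamp/nonce
-- must equal the `signature` query argument.
local function check_signature(token, args)
  local parts = { token, args.timestamp, args.nonce }
  table.sort(parts)
  local sha1 = resty_sha1:new()
  sha1:update(table.concat(parts))
  local digest = str.to_hex(sha1:final())
  return digest == args.signature
end

-- Inside the /wechat/accept GET handler (sketch):
-- local args = ngx.req.get_uri_args()
-- if check_signature("helloworld", args) then
--   ngx.say(args.echostr)
-- end
```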

Configuring app/config/config.lua

    -- WeChat-related config
    wechat = {
        appId = "xxx",  -- official account appID
        appSecret = "xxx", -- official account secret
        verifyToken = "helloworld", -- verification Token
    },
    -- Redis-related config
    redis = {
        host = "127.0.0.1",
        port = 6379,
        password = "",
        db_index = 0,
        max_idle_time = 30000,
        database = 0,
        pool_size = 100,
        timeout = 5000,
    },
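The redis table above maps directly onto lua-resty-redis calls. A hedged connection-helper sketch (the function name is ours, not the project's; it must run inside an OpenResty request context):

```lua
local redis = require "resty.redis"

-- Open a Redis connection using the fields from config.redis above.
local function get_redis(conf)
  local red = redis:new()
  red:set_timeout(conf.timeout)  -- milliseconds
  local ok, err = red:connect(conf.host, conf.port)
  if not ok then
    return nil, err
  end
  if conf.password ~= "" then
    local ok2, err2 = red:auth(conf.password)
    if not ok2 then return nil, err2 end
  end
  red:select(conf.db_index)
  return red
end

-- After use, return the connection to the pool instead of closing it:
-- red:set_keepalive(conf.max_idle_time, conf.pool_size)
```

Using `set_keepalive` with the configured idle time and pool size is what makes the connection pooling work under load tests like the one below.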

Starting the project

lapis server

Load testing

## autocannon is installed via npm (npm install -g autocannon)
autocannon -c 100 -d 30 -p 2 -t 2 http://127.0.0.1:8123/wechat/checkLogin?scene=NHAK5ElJqz73YHaYhltG

## Results
Running 30s test @ http://10.254.39.195:8123/wechat/checkLogin?scene=NHAK5ElJqz73YHaYhltG
100 connections with 2 pipelining factor


┌─────────┬───────┬────────┬────────┬────────┬───────────┬───────────┬─────────┐
│ Stat    │ 2.5%  │ 50%    │ 97.5%  │ 99%    │ Avg       │ Stdev     │ Max     │
├─────────┼───────┼────────┼────────┼────────┼───────────┼───────────┼─────────┤
│ Latency │ 12 ms │ 314 ms │ 652 ms │ 701 ms │ 316.26 ms │ 186.86 ms │ 3094 ms │
└─────────┴───────┴────────┴────────┴────────┴───────────┴───────────┴─────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg     │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤
│ Req/Sec   │ 7259    │ 7259    │ 8807    │ 9207    │ 8714.94 │ 436.3   │ 7258    │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤
│ Bytes/Sec │ 1.58 MB │ 1.58 MB │ 1.92 MB │ 2.01 MB │ 1.9 MB  │ 95.1 kB │ 1.58 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 30

267k requests in 30.03s, 57 MB read
55 errors (0 timeouts)

## QPS is roughly 8,700+