A Voice-Controlled Snake Game on the MAX78000
Deep-learning keyword spotting recognizes user-defined spoken commands to steer the snake in the game.
Tags
Embedded Systems
Funpack Activity
MAX78000
madfish
Updated 2023-02-03

Overview

1. Project Introduction

The project is based on the official Snake Game example project. Spoken keywords such as "up", "down", "left", and "right" are recognized to control the snake's movement. The hardware is the MAX78000FTHR development board, whose built-in hardware convolutional-neural-network accelerator can run AI inference at very low power. The display is a 1.54-inch LCD module made by DFRobot.

2. Design Approach

2.1 Environment Setup

The model-training environment is set up by following the official documentation. In practice, various problems come up depending on the hardware and the OS version.

This project uses Ubuntu 20.04 LTS. The main steps are as follows:

-- Install the packages the project needs. Note that some environments cannot handle the backslash (\) line continuations, in which case split the installation into several commands:

$ sudo apt-get install -y make build-essential libssl-dev zlib1g-dev \
  libbz2-dev libreadline-dev libsqlite3-dev wget curl llvm \
  libncurses5-dev libncursesw5-dev xz-utils tk-dev libffi-dev liblzma-dev \
  libsndfile-dev portaudio19-dev

-- Confirm that the system Python version is 3.8.x

-- Clone the project repositories

$ git clone --recursive https://github.com/MaximIntegratedAI/ai8x-training.git
$ git clone --recursive https://github.com/MaximIntegratedAI/ai8x-synthesis.git

-- Install the project dependencies (inside the ai8x-training virtual environment)

(ai8x-training) $ pip3 install -U pip wheel setuptools
(ai8x-training) $ pip3 install -r requirements-cu11.txt

2.2 Running the Training

./scripts/train_kws20_v3.sh

2.3 Problems Encountered While Setting Up

ERROR: resampy 0.4.2 has requirement numba>=0.53, but you'll have numba 0.49.1 which is incompatible.
ERROR: tensorboard 2.9.1 has requirement protobuf<3.20,>=3.9.2, but you'll have protobuf 3.20.3 which is incompatible.
ERROR: nbconvert 7.2.5 has requirement jinja2>=3.0, but you'll have jinja2 2.10.1 which is incompatible.
ERROR: nbconvert 7.2.5 has requirement pygments>=2.4.1, but you'll have pygments 2.3.1 which is incompatible.
ERROR: ipython 8.7.0 has requirement pygments>=2.4.0, but you'll have pygments 2.3.1 which is incompatible.

Manually install the package versions that the error messages call for:

$ pip install numba==0.53
$ pip install protobuf==3.9.2
$ pip install jinja2==3.0
$ pip install pygments==2.4.1

    

3. Collecting the Audio Samples

Since the goal is to recognize user-defined keywords, the user's own keyword utterances have to be recorded. To keep the model robust, record each keyword in several sessions and against a variety of background noise, then convert the recordings to the following format:

1 s duration, 32 kB, 16-bit little-endian PCM WAVE, 16 kHz sample rate
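
That format can be sanity-checked with the Python standard library's `wave` module. A minimal sketch (the filename and the synthetic stand-in tone are arbitrary, not part of the project):

```python
import math
import struct
import wave

SAMPLE_RATE = 16000  # 16 kHz
SAMPLE_WIDTH = 2     # bytes per sample -> 16-bit PCM
DURATION_S = 1       # one second per keyword clip

def write_clip(path):
    """Write a 1 s, 16 kHz, 16-bit mono little-endian PCM WAVE file.

    A 440 Hz tone stands in here for a recorded keyword.
    """
    frames = bytearray()
    for n in range(SAMPLE_RATE * DURATION_S):
        sample = int(10000 * math.sin(2 * math.pi * 440 * n / SAMPLE_RATE))
        frames += struct.pack("<h", sample)  # little-endian signed 16-bit
    with wave.open(path, "wb") as wf:
        wf.setnchannels(1)              # mono
        wf.setsampwidth(SAMPLE_WIDTH)   # 16-bit
        wf.setframerate(SAMPLE_RATE)    # 16 kHz
        wf.writeframes(bytes(frames))

def pcm_payload_bytes(path):
    """Verify the clip's format and return the PCM payload size in bytes."""
    with wave.open(path, "rb") as wf:
        assert wf.getnchannels() == 1
        assert wf.getsampwidth() == SAMPLE_WIDTH
        assert wf.getframerate() == SAMPLE_RATE
        return wf.getnframes() * wf.getsampwidth()

write_clip("keyword.wav")
print(pcm_payload_bytes("keyword.wav"))  # 16000 frames * 2 bytes = 32000 B = 32 kB
```

A real collection script would write microphone captures instead of the synthetic tone; the point is that 1 s at 16 kHz with 2 bytes per sample gives exactly the 32 kB payload listed above.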

4. Pre-training Implementation

After several weeks of attempts, the environment finally came together. Starting from WSL2, then Tencent Cloud, then Alibaba Cloud, the same installation procedure produced a different set of errors on each platform, and in the end I went back to WSL2. The environment now works and training runs, though it is fairly slow on a CPU-only platform.


Training epoch: 258930 samples (256 per mini-batch)
Epoch: [19][   10/ 1012]    Overall Loss 0.313984    Objective Loss 0.313984                                        LR 0.001000    Time 0.287783
Epoch: [19][   20/ 1012]    Overall Loss 0.301635    Objective Loss 0.301635                                        LR 0.001000    Time 0.199325
Epoch: [19][   30/ 1012]    Overall Loss 0.294099    Objective Loss 0.294099                                        LR 0.001000    Time 0.169220
... (mini-batch lines 40/1012 through 1000/1012 omitted; Overall Loss stays near 0.29 throughout; the attached training log has the full output) ...
Epoch: [19][ 1010/ 1012]    Overall Loss 0.296195    Objective Loss 0.296195                                        LR 0.001000    Time 0.110187
Epoch: [19][ 1012/ 1012]    Overall Loss 0.296139    Objective Loss 0.296139    Top1 85.135135    Top5 98.108108    LR 0.001000    Time 0.110134
--- validate (epoch=19)-----------
28770 samples (256 per mini-batch)
Epoch: [19][   10/  113]    Loss 0.326720    Top1 86.250000    Top5 98.476562
Epoch: [19][   20/  113]    Loss 0.320755    Top1 85.507812    Top5 98.437500
Epoch: [19][   30/  113]    Loss 0.324655    Top1 85.625000    Top5 98.307292
Epoch: [19][   40/  113]    Loss 0.324007    Top1 85.615234    Top5 98.271484
Epoch: [19][   50/  113]    Loss 0.321523    Top1 85.757812    Top5 98.281250
Epoch: [19][   60/  113]    Loss 0.325594    Top1 85.390625    Top5 98.229167
Epoch: [19][   70/  113]    Loss 0.328396    Top1 85.267857    Top5 98.208705
Epoch: [19][   80/  113]    Loss 0.325646    Top1 85.385742    Top5 98.271484
Epoch: [19][   90/  113]    Loss 0.325350    Top1 85.447049    Top5 98.272569
Epoch: [19][  100/  113]    Loss 0.327869    Top1 85.347656    Top5 98.281250
Epoch: [19][  110/  113]    Loss 0.327079    Top1 85.394176    Top5 98.306108
Epoch: [19][  113/  113]    Loss 0.327506    Top1 85.345846    Top5 98.300313
==> Top1: 85.346    Top5: 98.300    Loss: 0.328

==> Confusion:
[[ 856    0    1    1   13    3    0    0    4  101    0    5    0    1    2    2    0    2    0    3   17]
 [   4  973    1    4    5   21    3   11    0    2    2    1    0    0    3    4    1    2    7    5   12]
 [   3    2  963   10    2    1   22    6    1    2    2    3    2    6    3    5    2    1    5    5   11]
 [   3    1   18  935    1    2    3    1    0    0    6    1    2    1   27    1    1    3   12    1   11]
 [  12   11    3    4  932    6    0    1    2    6    0    3    0    2    6    3    7    0    0    3   12]
 [   1   49    1    8    5  915    3   39    2    2    1   10    4    7    2    1    0    1    2    5   14]
 [   2    2   19    0    0    2 1046    2    0    0    0    1    2    0    1    8    0    2    0   11    7]
 [   1   15   11    2    0   24    7  922    1    1    0    3    1    0    0    2    0    0   28   21    7]
 [   9    4    0    2    1    3    0    2  941   42    6    1    0   14   13    3    2    0    0    0    5]
 [  38    0    1    0    4    2    1    0   27  923    1    2    0   21    5    3    0    1    0    2    8]
 [   4    4    5    5    1    2    4    5   26    1 1009    0    2   11    3    1    2    1   14    4   15]
 [   2    1    0    0    0   11    2    4    0    0    1  924   20    5    0   11    4    4    0   22   13]
 [   1    0    2    7    2    1    0    2    2    0    0   47  905    1    3    6    5   22    1    5   41]
 [   1    0    3    0    6   15    0    1   10    9    4    8    1  916    5    4    3    0    0    8   15]
 [   4    3    1   19   10    1    1    0   26    4    2    0    2    1  944    0    3    2    8    2   16]
 [   3    2    3    2    4    1    4    1    1    0    0    6    2    2    1  982    6    7    0    3    8]
 [   1    6    2    2   11    2    0    0    1    3    1    3    1    1    3   11  997    1    0    4   30]
 [   6    1    1    9    0    1    3    1    1    0    0    9   21    5    0   19    1  969    0    4    8]
 [   2    4    5   21    1    0    0   16    2    0    6    1    1    0   15    1    1    0  937    2   10]
 [   0    2    0    0    1    2    5    9    0    0    0   12    3    4    0    5    8    3    0 1011    6]
 [ 114  152  124   64   84  107   43  118   80   91   63   94  206  243  100   84   88   28  127  197 5554]]

==> Best [Top1: 85.666   Top5: 98.377   Sparsity:0.00   Params: 169472 on epoch: 16]
Saving checkpoint to: logs/2023.01.28-232929/qat_checkpoint.pth.tar
--- test ---------------------
29787 samples (256 per mini-batch)
Test: [   10/  117]    Loss 0.326919    Top1 85.390625    Top5 98.476562
Test: [   20/  117]    Loss 0.337157    Top1 85.312500    Top5 98.378906
Test: [   30/  117]    Loss 0.336248    Top1 85.403646    Top5 98.385417
Test: [   40/  117]    Loss 0.341029    Top1 85.175781    Top5 98.339844
Test: [   50/  117]    Loss 0.343385    Top1 84.960938    Top5 98.343750
Test: [   60/  117]    Loss 0.347126    Top1 84.960938    Top5 98.365885
Test: [   70/  117]    Loss 0.347996    Top1 84.877232    Top5 98.342634
Test: [   80/  117]    Loss 0.348859    Top1 84.882812    Top5 98.300781
Test: [   90/  117]    Loss 0.350195    Top1 84.791667    Top5 98.355035
Test: [  100/  117]    Loss 0.350307    Top1 84.800781    Top5 98.351562
Test: [  110/  117]    Loss 0.348503    Top1 84.747869    Top5 98.338068
Test: [  117/  117]    Loss 0.350601    Top1 84.728237    Top5 98.324773
==> Top1: 84.728    Top5: 98.325    Loss: 0.351

==> Confusion:
[[ 874    1    4    1   14    3    1    1   10  121    2    0    1    4    4    3    2    1    1    0   11]
 [   2 1023    3    3    8   23    1   16    1    1    0    0    0    3    2    3    3    2   11    7   19]
 [   6    2  998   20    3    1   26   12    0    4    2    0    0    1    3    4    1    1    6    3    8]
 [   5    0   21  968    0    3    3    0    0    0   13    0    1    1   28    2    2    4   13    3    7]
 [  13   11    1    0  976    8    0    0    0   17    0    1    1    2    4    2    9    0    0    2   12]
 [   1   43    1    6    6  946    2   36    2    1    1    8    4   16    1    0    3    2    5    5   15]
 [   0    0   24    1    0    2 1086    3    0    0    1    0    0    1    1    7    0    0    0    6    5]
 [   1   15    9    1    5   36    3 1030    1    1    0    4    0    2    1    1    0    1   34    9   13]
 [   4    4    1    1    0    0    0    0  922   37    6    0    1   11   19    1    3    1    6    0   12]
 [  34    0    1    3    4    2    0    3   39  934    0    0    1   23    8    4    1    0    1    3   13]
 [   2    3    6   11    0    2    4    2   23    2  952    0    1   15    8    1    3    0   20    6   16]
 [   2    1    4    0    0   19    4    5    1    0    0  986   22    5    3    5    3    8    1   24   14]
 [   5    1    4    8    1    2    3    4    2    0    4   74  943    0    3   13   11   16    3   10   30]
 [   0    1    1    0    1    8    1    1   20   15    7   11    2  974    1    4    5    4    0    7   14]
 [   9    1    3   24   11    0    0    6   36    3    6    3    2    0 1011    1    4    1    9    1   12]
 [   1    0    1    1    1    1    4    0    1    0    0    6    6    2    1 1031    8    8    0    7   16]
 [   3    4    1    0    4    1    0    0    5    0    0    3    6    0    5   18 1063    1    1    7   21]
 [   1    0    1    8    0    1    4    0    6    0    0   12   19    2    4   18    2  924    0    2    7]
 [   0    5    7   13    2    3    0   33    4    0    0    2    0    0    5    1    1    0 1015    2    8]
 [   0    2    0    1    1    1    9    6    0    0    0   12    6    1    2   10    8    0    0 1097   14]
 [ 116  192  142   79   99  141   53   95   95   58   74   89  254  218  108   65   87   40  150  151 5485]]


Log file for this run: /home/madlabs/Documents/ai8x-training/logs/2023.01.28-232929/2023.01.28-232929.log
2428928000it [1:22:13, 492318.03it/s]

5. Display Adaptation

5.1 Pin Mapping Between the Display and the Controller Board

[Image: pin mapping between the LCD module and the MAX78000FTHR]

Add macro definitions for the reset and backlight control pins:

#define MXC_GPIO_PORT_RST MXC_GPIO2
#define MXC_GPIO_PIN_RST MXC_GPIO_PIN_6
#define MXC_GPIO_PORT_BL MXC_GPIO2
#define MXC_GPIO_PIN_BL MXC_GPIO_PIN_7

#define TFT_SPI MXC_SPI0 // SPI port to use for TFT display
#define TFT_SPI_PORT MXC_GPIO0 /**< GPIO port for SPI peripheral pins. */
#define TFT_SPI_PINS \
    MXC_GPIO_PIN_5 | MXC_GPIO_PIN_6 | MXC_GPIO_PIN_7 /**< GPIO pins for SPI peripheral. */

#define TFT_DC_PORT MXC_GPIO2 /**< GPIO port for Data/Command signal. */
#define TFT_DC_PIN MXC_GPIO_PIN_7 /**< GPIO pin for Data/Command signal. */
#define TFT_SS_PORT MXC_GPIO2 /**< GPIO port for select signal. */
#define TFT_SS_PIN MXC_GPIO_PIN_4 /**< GPIO pin for select signal. */

Change the screen dimensions:

#define screenWidth 240
#define screenHeight 240

Modify the display initialization code:

    /* Initialize the TFT display: configure the reset pin as an input with a
     * pull-up, the backlight pin as an output (the original snippet passed the
     * backlight config uninitialized), then hand both to the driver. */
    mxc_gpio_cfg_t gpio_rst;
    mxc_gpio_cfg_t gpio_bl;

    gpio_rst.port = MXC_GPIO_PORT_RST;
    gpio_rst.mask = MXC_GPIO_PIN_RST;
    gpio_rst.pad  = MXC_GPIO_PAD_PULL_UP;
    gpio_rst.func = MXC_GPIO_FUNC_IN;

    gpio_bl.port = MXC_GPIO_PORT_BL;
    gpio_bl.mask = MXC_GPIO_PIN_BL;
    gpio_bl.pad  = MXC_GPIO_PAD_NONE;
    gpio_bl.func = MXC_GPIO_FUNC_OUT;

    MXC_TFT_Init(&gpio_rst, &gpio_bl);

Adjust the game UI dimensions and widget positions:

    TFT_Print(buff, 50, 80, font_2, snprintf(buff, sizeof(buff), "Funpack 2-3"));
    TFT_Print(buff, 50, 100, font_2, snprintf(buff, sizeof(buff), "SNAKE GAME"));
    TFT_Print(buff, 10, 210, font_2, snprintf(buff, sizeof(buff), "PRESS PB1(SW1) TO GO"));

6. Results

[Photos: the finished game running on the MAX78000FTHR with the 1.54-inch LCD]

7. Problems Encountered

7.1 The Development Environment Is Awkward to Use

The peripheral modules used by the example projects are pulled into the build as libraries, so porting any other third-party library means writing Makefile rules yourself. For example, the TFT driver in the original example targets the ILI9341 controller; switching to a different panel means editing the Makefile to add the new driver sources to the build.
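
For reference, pulling local driver sources into an MSDK example build usually comes down to a few Makefile variables (a sketch only; the `drivers/st7789` directory and file name are hypothetical stand-ins for whichever panel driver is being added):

```makefile
# Hypothetical third-party TFT driver added to the example's build
VPATH += drivers/st7789   # directory searched for the .c sources below
IPATH += drivers/st7789   # directory added to the header search path
SRCS  += st7789.c         # driver source to compile and link
```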


7.2 Header Search Paths Set in the Project Configuration Have No Effect

The plan was originally to build the GUI with LVGL, but that requires writing the Makefile by hand, and with no prior experience I gave it up. I also tried dropping lvgl straight into the project directory, but the build never compiled the lvgl sources, and including the lvgl headers failed with "file not found" errors.


8. Takeaways

I am very glad to have taken part in this activity. Having never touched Linux or machine learning before, I went from installing the OS, to installing the various development dependencies, to finally training the model. Along the way I picked up the basics of using a Linux system and gained a real introduction to machine learning. I used to assume machine learning needed a PC or very powerful hardware; it turns out it can run on an MCU as well.


Attachments

training-log-2023.01.28-232929.log
Training log
funpack_snake_game.rar
Project files

Team

王者
