David Packard Simulator
This document details how the simulation environment for the David Packard was built. First, let's look at a high-level diagram of the various components. There are six main components in the simulator. They are:
- An Ubuntu 24 virtual machine named `sim-dpkd.shore.mbari.org` running simulation software that pushes generated signals to the local serial ports.
- A Moxa NPort 16-port RS-232 serial device server that is connected to the VM mentioned above via Linux kernel drivers.
- A Digi Connect TS16 that takes in physical connections from the outputs of the Moxa NPort.
- A Digi Connect TS32 that takes in physical connections from the outputs of the Moxa NPort.
- An Intel NUC computer running the navproc/logr software that has the TS16 and TS32 ports mounted virtually as `/dev/tty` devices.
- An Ubuntu 24.04 VM server named `coredata-dpkd-sim.shore.mbari.org` that serves as the navproc-api server.
```mermaid
flowchart LR
    sim-dpkd["sim-dpkd.shore.mbari.org<br/>Ubuntu 24.04"]
    moxa["Moxa NPort<br/>sim-moxa-dpkd<br/>134.89.11.159"]
    digi16["Digi TS 16"]
    digi32["Digi TS 32"]
    navproc["navproc-dpkd-sim.shore.mbari.org<br/>Ubuntu 22.04"]
    coredata["coredata-dpkd-sim.shore.mbari.org<br/>Ubuntu 24.04"]
    sim-dpkd --> moxa
    moxa --> digi16 --> navproc
    moxa --> digi32 --> navproc
    navproc --> coredata
```
The physical simulator hardware setup looks like the following:

Here is a diagram of the signal connections on the Packard Simulator
Moxa NPort 16
I wanted to make sure the Moxa was configured properly before setting up the signal simulator. In order to provide support for multiple serial ports on the sim-dpkd machine, we use a Moxa NPort 5610 which can then provide virtual serial ports. To configure the NPort for this simulator:
- I downloaded the NPort Administrator Suite to a separate Windows 11 machine and then downloaded the latest (v3.12) firmware ROM from the Moxa support site.
- I installed the NPort Administrator Suite and ran it. It automatically found the Moxa which had previously been configured with IP 192.168.127.254. It had v3.9 firmware on it.
- I used the NPort Administrator Suite to update the firmware.
- I fired up the NPort configuration utility again and it found the Moxa.
- I connected to it with the default username and password, enabled DHCP, and then set the Netmask to 255.255.254.0, the gateway to 134.89.10.1, DNS Server 1 to 134.89.10.10, and DNS Server 2 to 134.89.12.87. I renamed it to `sim-moxa-dpkd` so it was easily recognizable and then I saved. It automatically restarted with an IP address of 134.89.11.159.
- When I connected over the web interface, it asked me for a new password, so I set the password to match that of the 'ops' account on the navproc machines.
- You can find the NPort web page located here
sim-dpkd.shore.mbari.org
- Submitted a help ticket to create a VM for the signal simulator. This should just be a vanilla Ubuntu 24 with Docker installed. The hostname of the machine will be `sim-dpkd.shore.mbari.org`. Here is the ticket text:
    - Request_Submitted_By: <kgomes@mbari.org>
    - VM_Name: sim-dpkd
    - VM_Purpose: This VM will be used to run a Python Docker container that simulates signals for the current Navproc installation on the Rachel Carson. Peter, let's make this one Ubuntu 24 if that's OK with you. I am putting one year on this VM because it should only be in operation until we get the new infrastructure online.
    - VM_Expiration: 1 year
    - VM_Support: IS_Supported
    - VM_Support_Alt_Adm:
    - VM_OS: > Refer to Comments
    - CPU: 2
    - CPU_Addl_Reason:
    - RAM: 4
    - RAM_Addl_Reason:
    - GPU_Extra:
    - GPU: NO
    - Disk_Extra:
    - Network: Internal (SHORE)
    - Resource_Priority: Low
    - Resource_Priority_Reason:
    - Conf_Firewall:
    - Conf_Desktop: YES
    - Conf_Desktop_Extra: Could you also enable Remote Desktop on this machine? I will do most work via ssh, but it will be helpful to have Desktop access.
    - Conf_Logins: local
    - Conf_Docker: YES
    - Conf_Docker_Extra:
    - Conf_WebServer: none
    - Conf_sudo: sudo for kgomes local account. Can you also create a local account named 'ops' that has sudo too? This is the way Mike and I use Navproc now and we would like to do it this way so we can run the navproc/logging stuff as ops and then we can both login directly as ops and manage it.
    - Conf_vCenter_Access:
    - VM_Comments: Ubuntu 24 please.
- Peter finished the VM installation and gave the `ops` account the default password.
- I needed to ssh into the VM first to make sure that the system prompted for a new password by running `ssh ops@sim-dpkd.shore.mbari.org`.
- I changed the password to the normal ops password.
- I then brought up the Windows.app on my Mac and created a connection to this VM. It opened right to the initial setup of the Ubuntu installation
- I accepted mostly the defaults for the options in the setup.
- Note the default UI through the remote desktop is different from the normal one.
- I ran the software updater to get everything up to date.
- I opened up the `Settings` application and under the Power settings I set Screen Blank to Never.
- I then opened a terminal and ran `sudo apt-get install build-essential cmake python3-pip python3-venv -y`.
- Next, I downloaded the 'Real TTY Linux Kernel 6.x Driver' from the Moxa website into the Downloads folder. Then I ran:

    ```
    tar -xvf moxa-real-tty-linux-kernel-6.x-driver-v6.1.tar
    cd moxa
    sudo ./mxinst
    cd /usr/lib/npreal2/driver
    sudo ./mxaddsvr 134.89.11.159 16
    ```
- This creates the virtual serial ports at `/dev/ttyr00` through `/dev/ttyr0f`.
- The code for the Python simulator is located in BitBucket here and it was checked out to the `/opt` directory by changing to the `/opt` directory and running `sudo git clone https://kgomes@bitbucket.org/mbari/corenav-simulators.git` (I used my `dev-checkout` app password).
- I then ran `sudo chown -R ops:ops corenav-simulators` to change ownership over to the ops account.
- I then `cd corenav-simulators` and ran `mkdir logs` to create a directory where log files will go.
- Next, I needed to grab some log files from the Carson so I could replay them. I cd'd into the `/opt/corenav-simulators/data` directory and ran:

    ```
    mkdir log-files
    cd log-files
    mkdir carson
    cd carson
    scp ops@rcnavproc1.rc.mbari.org:/home/ops/corelogging/rc/data/2024339* .
    scp ops@rcnavproc1.rc.mbari.org:/home/ops/corelogging/rc/data/2024340* .
    scp ops@rcnavproc1.rc.mbari.org:/home/ops/corelogging/rc/data/2024341* .
    sudo gunzip *.gz
    ```
Warning
When first setting this simulator up, we did not have any data from the Packard. Eventually, these data files should be replaced by the proper Packard instrument generated files so the simulated signals are correct.
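As a side note, the copied file names start with the year plus the day of year (`YYYYDDD`). A small stdlib sketch to decode that prefix (the helper name `logr_file_date` is just for illustration, not part of the simulator code):

```python
from datetime import datetime

def logr_file_date(filename):
    """Decode the YYYYDDD (year + day-of-year) prefix of a logr file name."""
    return datetime.strptime(filename[:7], "%Y%j").date()

print(logr_file_date("2024340shipgyrofulllogr.dat"))  # 2024-12-05
```

So the `2024339*` through `2024341*` files above cover December 4-6, 2024.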
- Then in `/opt/corenav-simulators`, I edited the simulator_config.json file to look like the entries below.

    ```json
    {
      "name": "David Packard Data Simulators",
      "version": "0.1",
      "description": "Simulator for data that should be coming from the David Packard",
      "author": "Kevin Gomes",
      "logging-level": "DEBUG",
      "logging-format": "%(asctime)s: %(message)s",
      "logging-datefmt": "%H:%M:%S",
      "try-to-send-over-serial": true,
      "simulators": [
        {
          "name": "logr_simulator",
          "config": {
            "type": "logr-file-reader",
            "log-dir": "./data/log-files/carson/",
            "file-mapping": {
              "2024340csprawfulllogr.dat": { "port": "/dev/ttyr09", "baudrate": 9600, "parity": "N", "stopbits": 1, "bytesize": 8 },
              "2024340gtdprologr.dat": { "port": "/dev/ttyr06", "baudrate": 9600, "parity": "N", "stopbits": 1, "bytesize": 8 },
              "2024340lodestarlogr.dat": { "port": "/dev/ttyr05", "baudrate": 9600, "parity": "N", "stopbits": 1, "bytesize": 8 },
              "2024340nav4dlogr.dat": { "port": "/dev/ttyr0a", "baudrate": 9600, "parity": "N", "stopbits": 1, "bytesize": 8 },
              "2024340nmeafulllogr.dat": { "port": "/dev/ttyr03", "baudrate": 9600, "parity": "N", "stopbits": 1, "bytesize": 8 },
              "2024340seabirdctdfulllogr.dat": { "port": "/dev/ttyr04", "baudrate": 9600, "parity": "N", "stopbits": 1, "bytesize": 8 },
              "2024340shipgyrofulllogr.dat": { "port": "/dev/ttyr02", "baudrate": 4800, "parity": "N", "stopbits": 1, "bytesize": 8 },
              "2024340uhsmsgfulllogr.dat": { "port": "/dev/ttyr07", "baudrate": 9600, "parity": "N", "stopbits": 1, "bytesize": 8 }
            }
          }
        }
      ]
    }
    ```

- In order to just test this, I ran `./simulator.sh` in the `/opt/corenav-simulators` directory. Once I verified data was being generated properly, I killed the python process.
- Now to get this to run as a service, I created a service startup file `/etc/systemd/system/corenav-simulators.service` which looks like:

    ```ini
    [Unit]
    Description=Python scripts to simulate data for corenav
    After=network.target

    [Service]
    Type=forking
    ExecStartPre=/bin/sleep 30
    ExecStart=/opt/corenav-simulators/simulator.sh
    Restart=always

    [Install]
    WantedBy=default.target
    ```

- The service can then be enabled by running the following:

    ```
    sudo systemctl daemon-reload
    sudo systemctl enable corenav-simulators.service
    sudo systemctl start corenav-simulators.service
    ```

- I then rebooted the machine to make sure the simulators started properly.
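The `file-mapping` section of the config is what ties each replayed log file to a serial port. A minimal stdlib sketch of how such a mapping can be read (the structure is taken from the simulator_config.json above; the `port_settings` helper is illustrative, not the actual simulator code):

```python
import json

# Trimmed-down excerpt of the simulator_config.json structure shown above
CONFIG = json.loads("""
{
  "try-to-send-over-serial": true,
  "simulators": [
    {
      "name": "logr_simulator",
      "config": {
        "type": "logr-file-reader",
        "log-dir": "./data/log-files/carson/",
        "file-mapping": {
          "2024340shipgyrofulllogr.dat":
            {"port": "/dev/ttyr02", "baudrate": 4800, "parity": "N", "stopbits": 1, "bytesize": 8},
          "2024340nmeafulllogr.dat":
            {"port": "/dev/ttyr03", "baudrate": 9600, "parity": "N", "stopbits": 1, "bytesize": 8}
        }
      }
    }
  ]
}
""")

def port_settings(config):
    """Flatten the file-mapping into (file, port, baudrate) tuples."""
    for sim in config["simulators"]:
        for fname, serial_cfg in sim["config"]["file-mapping"].items():
            yield fname, serial_cfg["port"], serial_cfg["baudrate"]

for fname, port, baud in port_settings(CONFIG):
    print(f"{fname} -> {port} @ {baud} baud")
```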
Warning
Once, after doing a standard upgrade on the Ubuntu installation, the serial ports failed to be recognized. I had to uninstall the driver by running `sudo ./mxuinst` from the `~/Downloads/moxa/` directory and then running `sudo ./mxinst` again. After that, run `cd /usr/lib/npreal2/driver` and then `sudo ./mxaddsvr 134.89.11.159 16` to add the virtual ports back.
sim-ts16-dpkd (Digi TS16)
The next in line is a Digi TS 16 terminal server. The serial cables from the Moxa are routed to the terminal server which is then virtually mounted on the computer that is running the navproc and logr code.
The webpage for the Digi TS 16 terminal server can be found here; ask Kevin Gomes for login credentials if you need them.
Some support documents:
Installation Steps:
- Before starting, I took a picture of the label so I could get the MAC address off it before installing it in the rack. The MAC address is `00409DD38856`.
- I wanted to do a factory reset to clear any settings. Before plugging in the power, I held the reset button and then plugged in the power plug. I held the reset button for about 30 seconds and the LED on the back started blinking in a 1-5-1 pattern, so I released the reset button.
- I downloaded the Digi Discovery Tool to my windows machine and ran it.
- I could see the Digi was assigned an IP address of 134.89.11.118, so I double clicked on it to open up the settings and it opened the web page interface. I logged in using the default user of `root` with the password that was printed on the label with the MAC address.
- Under Network->Advanced Network Settings, I changed the `Host Name` to `sim-ts16-dpkd` and the DNS servers to `134.89.10.10` and `134.89.12.87` and then clicked on Apply.
- I then went to `Reboot` and rebooted the terminal server.
- It had the most recent firmware so I did not do a firmware update.
- Under Users->root, I changed the password to match the `ops` password we use.
- Under System, I set the Device Identity Settings for Description and Contact.
- Under System, I set the Date and Time to use a UTC offset of -08 hours and set the first time source to use `time-sh1.shore.mbari.org`.
- I rebooted again, but the time and date did not update so I set them manually.
sim-ts32-dpkd (Digi TS32)
- Before starting, I took a picture of the label on the outside so I would have the MAC address.
- I then installed the TS in the rack, connected both ethernet ports to the LAN and then powered it on.
- Went to the Digi website and downloaded the "Digi Navigator" for Windows 11.
- I installed the application with the default settings, added a desktop shortcut and ran the utility.
- I could see from the list in the application that there was something called "EZ32-001007", but it had two IP addresses underneath it, presumably because I have two cables plugged in. The IPs are quite different though. One is `134.89.10.214` and the other is `134.89.11.32`. I wanted to make sure I was not picking up one of the other 32 port TSs. I unplugged the top cable and the `134.89.10.214` address disappeared, so that was easy. I plugged it back in and it came back. I unplugged the other cable and the other IP disappeared, so I plugged it back in. At least I knew I had found the right one. I unplugged the second cable.
- I selected the 134.89.11.32 IP and then clicked on `Configure Device for RealPort`.
- I entered the user `admin` and the password printed on the label. A message box popped up saying `OK`.
- I then clicked on the `HTTPS` button which opened a web page and I logged in using `admin` and the password on the label.
- First, I went to the 'admin' -> 'change password' menu and changed the password. NOTE: I couldn't use the standard ops password as it wasn't secure enough, so I changed it a little and added it to my BitWarden application under the SE-IE shared password.
- Next, I went to the System -> Firmware Update menu and then selected 'Download from server'. There was an update so I clicked 'Update firmware'. It updated automatically and rebooted, and then I logged in again.
Intel NUC navproc-dpkd-sim.shore.mbari.org
The next component in the chain is an Intel NUC machine that will run the Navproc, Logr and LCM bridge software and is named navproc-dpkd-sim.shore.mbari.org. I used the standard Navproc Computer Setup to get all the software installed. Please consult that documentation which will install all the necessary software and tools. Once that is all complete, but before the processes are started, I needed to set up the connection to the DigiTS and get the signals routed to the correct serial ports.
One difference in the setup is that we are using a TS16 and a TS32, so after setting up the TS16, I took the following steps.
Signal Connections
Signal Details
| Simulated Device | navproc-dev-ubuntu-22 source | Protocol/COM Port | Baud Rate | Data Bits | Stop Bits | Parity | Moxa IP Address | Moxa Port | Notes |
|---|---|---|---|---|---|---|---|---|---|
| | | /dev/ttyr00 | | | | | 134.89.10.247 | 1 | |
| GPS | nmeaGPS_sim.py | /dev/ttyr01 | 9600 | 8 | 1 | None | 134.89.10.247 | 2 | |
| CTD | file_reader_sim.py | /dev/ttyr02 | 9600 | 8 | 1 | None | 134.89.10.247 | 3 | |
| Gyro | file_reader_sim.py | /dev/ttyr03 | 9600 | 8 | 1 | None | 134.89.10.247 | 4 | |
| | | /dev/ttyr04 | | | | | 134.89.10.247 | 5 | |
| | | /dev/ttyr05 | | | | | 134.89.10.247 | 6 | |
| | | /dev/ttyr06 | | | | | 134.89.10.247 | 7 | |
| | | /dev/ttyr07 | | | | | 134.89.10.247 | 8 | |
| GPS | nmeaGPS_sim.py | /dev/ttyr08 | 9600 | 8 | 1 | None | 134.89.10.247 | 9 | |
| CTD | file_reader_sim.py | /dev/ttyr09 | 9600 | 8 | 1 | None | 134.89.10.247 | 10 | |
| Gyro | file_reader_sim.py | /dev/ttyr0a | 9600 | 8 | 1 | None | 134.89.10.247 | 11 | |
| | | /dev/ttyr0b | | | | | 134.89.10.247 | 12 | |
| | | /dev/ttyr0c | | | | | 134.89.10.247 | 13 | |
| | | /dev/ttyr0d | | | | | 134.89.10.247 | 14 | |
| | | /dev/ttyr0e | | | | | 134.89.10.247 | 15 | |
| | | /dev/ttyr0f | | | | | 134.89.10.247 | 16 | |
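The Moxa Port column follows directly from the hex suffix of the virtual device name: the driver numbers `/dev/ttyr00` through `/dev/ttyr0f` against Moxa ports 1 through 16. A small sketch of that relationship (the helper names are just for illustration):

```python
def moxa_port_for_device(device):
    """Map a Moxa virtual tty like /dev/ttyr0a to its 1-based Moxa port number."""
    hex_suffix = device.rsplit("ttyr", 1)[1]
    return int(hex_suffix, 16) + 1

def device_for_moxa_port(port):
    """Inverse mapping: Moxa port number back to the /dev/ttyrXX name."""
    return f"/dev/ttyr{port - 1:02x}"

print(moxa_port_for_device("/dev/ttyr0a"))  # 11 (matches the Gyro row above)
print(device_for_moxa_port(16))             # /dev/ttyr0f
```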
API Server
The next step was to configure a server in the VM cluster that will act as an API server for the data coming from navproc. This API consists of a ZMQ proxy, a Redis server, a UDP proxy that replicates the old bcserver, a Telegraf socket that feeds an InfluxDB database and a Grafana server. Here is a basic diagram of the API services and data flow.
- I submitted a ticket to have IS build an Ubuntu 20.04 VM, and Peter built the VM `dp-sim-api.shore.mbari.org` and gave Mike and me sudo privs on it.
- I ssh'd into the dp-sim-api server as me.
- I checked to make sure openssh-client was installed by running `apt list openssh-client` (it was installed).
- I started an ssh-agent by running `eval $(ssh-agent)` which returned the pid of the agent.
- I changed into the `/etc/ssh/keys/kgomes` directory by running `cd /etc/ssh/keys/kgomes` so my generated key would end up there.
- I then created a new ssh key by running `ssh-keygen -t ed25519 -b 4096 -C "{kgomes@mbari.org}" -f ops_as_kgomes` (I used the login password for ops as the key password). This created the `ops_as_kgomes` private key and the `ops_as_kgomes.pub` public key.
- I added the ssh key to the agent by running `ssh-add /etc/ssh/keys/kgomes/ops_as_kgomes` (had to enter the password).
- I then created a `/etc/ssh/keys/kgomes/config` file and added the following:

    ```
    Host bitbucket.org
        AddKeysToAgent yes
        IdentityFile /etc/ssh/keys/kgomes/ops_as_kgomes
    ```
- This finishes the setup of my ssh key on the api machine. Now I need to add the ssh key to BitBucket. I logged into BitBucket, went to workplace settings and then clicked on SSH Keys. Then I clicked on Add Key, gave it a name of "From dp-sim-api.shore.mbari.org as kgomes", and then back in the terminal, in the .ssh directory, ran `cat ops_as_kgomes.pub`, selected, copied, and pasted the entire string into the BitBucket Key window. Now I should have access to the repositories I need.
- Now, to install the api software, I went to the opt directory by running `cd /opt`.
- I created the corenav directory using `sudo mkdir corenav`.
- Changed it to be owned by the ops account using `sudo chown ops corenav`.
- I then switched to the `ops` account by running `sudo -u ops -i`.
- Then I went into the corenav directory by running `cd corenav`.
- I then checked out the API repo by running `git clone git@bitbucket.org:mbari/navproc-api.git` (I used an app password from my kgomes account).
- I then cd'd into the navproc-api directory.
- Before starting everything, I needed to copy the .env.template file to a file named .env and then edit it to set the passwords and such for the API.
- Now everything should be ready to go, and I started up the API by running `docker-compose up -d` which runs everything in the background. You can double check this by running `docker ps` and you should see 5 containers running.
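Beyond `docker ps`, a quick smoke test is to confirm the TCP services are actually listening. The sketch below checks a few of the API services using their stock default ports; the actual docker-compose.yml may publish different ports (and the ZMQ/UDP proxies are not covered here), so treat the port numbers as assumptions:

```python
import socket

# Stock default ports for some of the API services; the compose file
# may map them differently, so these numbers are assumptions.
SERVICE_PORTS = {
    "redis": 6379,
    "influxdb": 8086,
    "grafana": 3000,
}

def is_listening(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, port in SERVICE_PORTS.items():
        status = "up" if is_listening("localhost", port) else "DOWN"
        print(f"{name:10s} {port:5d} {status}")
```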