Experiment software to replicate a VR foraging task described in the paper "Phasic and tonic pain serve distinct functions during adaptive behaviour"
In this repo, we share all software tools to replicate the experiment described in the paper "Phasic and tonic pain serve distinct functions during adaptive behaviour". You can run the experiment and analyse the collected data with scripts from this repo: https://github.com/ShuangyiTong/Phasic-and-tonic-pain-serve-distinct-functions-during-adaptive-behaviour/tree/main.
For any questions, you can open an issue or send an email to Shuangyi Tong <shuangyi.tong@ndcn.ox.ac.uk>.
Although we fully disclose our software, you still need the right hardware to replicate the experiment. All hardware is commercially available. Even if some items have been discontinued, you should be able to find alternatives easily. Contact us if you have trouble finding the right hardware.
A Windows PC is required. We used an Alienware Aurora R11 (Intel i9-11900KF, Nvidia RTX 3090, 64GB RAM) to run the VR games. We used Windows 10 Pro, but Windows 11 should be fully compatible.
The VR system we used is the HTC VIVE Pro Eye. The key compatibility concern with other headset systems is that we used the SRanipal eye tracking API. If you can find a headset that supports this API and is compatible with SteamVR, it is likely to work with our released game build.
We also used the VIVE Wireless Adapter instead of connecting the headset to the PC directly with the DisplayPort cable.
We used a National Instruments USB-6212 (BNC) data acquisition device to generate the voltage signal that controls the Digitimer DS5 isolated bipolar constant current stimulator. Other models of NI USB DAQ devices might work.
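The DS5 scales its output current linearly with the control voltage it receives from the DAQ. The sketch below shows the mapping; the 10 mA / 10 V range setting, the function name, and the clamping behaviour are all assumptions to be checked against your own device configuration, not the released control program's actual code.

```python
# Hypothetical helper: compute the DAQ analog-output voltage for a desired
# DS5 stimulation current, assuming a linear 10 mA per 10 V range setting.
# Both range constants below are placeholders; verify them against the
# front-panel setting of your own DS5 before use.

def control_voltage(target_ma: float, max_ma: float = 10.0,
                    max_volts: float = 10.0) -> float:
    """Map a target current (mA) to a control voltage, rejecting out-of-range values."""
    if not 0.0 <= target_ma <= max_ma:
        raise ValueError(f"target current {target_ma} mA outside 0..{max_ma} mA")
    return target_ma / max_ma * max_volts

print(control_voltage(2.5))  # 2.5 mA at a 10 mA / 10 V setting -> 2.5 V
```

In practice the resulting voltage would be written to an analog output channel of the NI device, for example via the `nidaqmx` Python package (the device/channel name, such as `Dev1/ao0`, depends on your NI MAX configuration).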
In experiment 1, the physiological sensor is based on an Arduino Nano 33 IoT microcontroller. As described in the paper, a separate GSR sensor needs to be purchased; we used the Grove GSR sensor by Seeed Studio. Other microcontrollers with a network module and ADC channels should work as well, though one might need to change the networking calls.
We used a single BrainProducts LiveAmp system (32 channels) to record EEG. The BrainVision software is required to record the data. A wireless trigger box is also required to send triggers wirelessly.
We used VBM tourniquet cuffs together with their hand inflator.
We used speech recognition to record pain ratings at the end of each trial. You need a Microsoft Azure API key to run this. Alternatively, you can simply record participants' ratings by hand.
As participants carry many wireless devices and move around, it is recommended to find a vest with many pockets and velcro mounts to attach all devices securely.
Download the control program from the release page. Activate the relevant experiment control scripts.
Physiological data with Arduino device: Download the firmware source for the Arduino Nano 33 IoT from the release page. Configure the WiFi SSID and password to connect to your local network, and set the correct IP address of the host where the control program is running. Flash the firmware with the Arduino IDE. Once the program starts, it should automatically connect to WiFi and register itself with the control program.
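The register-on-startup step above can be pictured as a simple network handshake: the device connects to the control program and announces a name. The actual registration protocol of the released control program is not documented here; the following is only a minimal loopback TCP sketch of the idea, with a hypothetical device name.

```python
import socket
import threading

# Hypothetical sketch of host-side device registration: a wireless device
# (e.g. the Arduino GSR sensor) connects and announces its name once.

def accept_one_registration(server: socket.socket, registry: dict) -> None:
    conn, addr = server.accept()
    with conn:
        name = conn.recv(64).decode().strip()
        registry[name] = addr  # remember device name -> (ip, port)

def register_device(host: str, port: int, name: str) -> None:
    with socket.create_connection((host, port)) as s:
        s.sendall((name + "\n").encode())

registry: dict = {}
server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=accept_one_registration, args=(server, registry))
t.start()
register_device("127.0.0.1", port, "arduino-gsr")  # placeholder device name
t.join()
server.close()
print(sorted(registry))  # -> ['arduino-gsr']
```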
EEG with BrainProducts LiveAmp: Start LiveAmp recording with BrainVision Recorder. Download the wireless trigger software from the release page and launch it; it should automatically register itself with the control program.
Download the stimulator control program from the release page and start it.
Download the speech recognition program from the release page and start it.
Download the VR program from the release page and start it. If you are using the same hardware we used, you also need to start the VIVE Wireless program first. You might also want to perform eye tracking calibration. SteamVR is launched automatically when you start the VR program.
Click File in the top menu of the main control program, then click Save All Data. It is recommended to save the data in a separate new folder whose name matches the saving name. That folder can then be copied directly into our analysis codebase's datasets folder. A data notes file is required; for its format, refer to our released raw data.
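Before copying a saved folder into the analysis codebase, it can help to verify the layout described above. This is a small hypothetical helper, not part of the released tools; the notes file name "data_notes.txt" is an assumption, so match it to our released raw data.

```python
from pathlib import Path

# Hypothetical check that a saved-data folder is ready to copy into the
# analysis codebase's datasets folder: folder name matches the save name,
# and a data notes file is present. The notes file name is a placeholder.

def check_dataset_folder(folder: Path, save_name: str,
                         notes_name: str = "data_notes.txt") -> list[str]:
    """Return a list of problems; an empty list means the folder looks ready."""
    problems = []
    if folder.name != save_name:
        problems.append(f"folder name {folder.name!r} != save name {save_name!r}")
    if not (folder / notes_name).is_file():
        problems.append(f"missing notes file {notes_name!r}")
    return problems
```

For example, `check_dataset_folder(Path("datasets/sub001"), "sub001")` returns an empty list only if the folder exists under that name and contains the notes file.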