hardware:computer [2024/02/14 17:55] (current) Jon Daniels
100 MB/sec is typical for a magnetic hard drive; 300 MB/sec is typical for a single SSD. If the data rate is too high for a single SSD, use SSDs in a RAID0 configuration (e.g. 4 SSDs in RAID0 can achieve >1 GB/s). Lately, M.2 drives with a PCIe interface have become available with speeds comparable to a RAID0 of SSDs, and they may be a good option. To benchmark your PC's hard drive write speed you can use [[http://crystalmark.info/?lang=en|Crystal Disk Mark]]. I'm pretty sure the relevant score for diSPIM acquisition is the "Seq" "Write" score (sequential write, block size = 1 MiB, single thread), at least for Micro-Manager software with typical acquisition settings.
Light sheet can generate lots of data very quickly, and it is important to have a plan to deal with the deluge. This often involves support from the institution's IT department. A helpful discussion of the challenges and options is the article [[https://arxiv.org/abs/2108.07631v1|Biologists need modern data infrastructure on campus]].
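As a rough sanity check on whether a given drive is fast enough, the required sustained write rate can be estimated from the camera parameters. All the numbers below are hypothetical examples, not diSPIM specifications:

```python
# Rough estimate of the sustained disk write rate needed during acquisition.
# All parameter values here are hypothetical examples, not diSPIM specs.

frame_width = 2048          # pixels
frame_height = 2048         # pixels
bytes_per_pixel = 2         # 16-bit camera
frames_per_second = 100     # combined frame rate over both views

bytes_per_frame = frame_width * frame_height * bytes_per_pixel
rate_mb_per_s = bytes_per_frame * frames_per_second / 1e6

print(f"required write rate: {rate_mb_per_s:.0f} MB/s")
# ~839 MB/s here -- beyond a single SATA SSD, so RAID0 or M.2/PCIe is needed
```

Compare the result against the benchmarked sequential write score of the target drive, leaving some headroom for filesystem overhead.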
===== Data Analysis =====
| |
Some users do image analysis and processing on a separate workstation, while others use the acquisition computer when it is not being used for acquisition.
Having lots of RAM can speed any analysis; ideally the entire dataset can be held in active memory. If possible, get a computer with a CUDA-capable graphics card, because some data analysis software can take advantage of it to speed up computation (OpenCL is a competing framework for GPU computation). This is a nascent area and depends on software support; many software developments for data analysis are forthcoming, so it is hard to say exactly what the best hardware will be in the long run.
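To gauge whether a dataset can be held entirely in RAM, its size can be estimated from the acquisition parameters. The numbers below are hypothetical examples:

```python
# Estimate total dataset size to see whether it fits in RAM.
# All parameter values here are hypothetical examples.

volume = (2048, 2048, 100)   # x, y, z pixels per stack
bytes_per_pixel = 2          # 16-bit data
views = 2                    # dual views
timepoints = 50

voxels_per_stack = volume[0] * volume[1] * volume[2]
dataset_gb = voxels_per_stack * bytes_per_pixel * views * timepoints / 1e9

print(f"dataset size: {dataset_gb:.0f} GB")
# ~84 GB here -- if this exceeds installed RAM, process in chunks or add memory
```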
Micro-Manager 2.0 can helpfully reslice data into the "lab frame", which is especially useful for stage-scanning data. The GPU version of the algorithm can operate on datasets up to 1/4 of the GPU memory, e.g. 2 GB datasets can be processed on a GPU with 8 GB of working memory.
===== Specific suggestions =====