Workshops and Tutorials
"Fundamental Tools for Data Scientists - SHELL, GIT, Python and SQL".
Training at the 2015 Software Carpentry Hands-on, May 4-7, 2015, Lawrence Berkeley National Laboratory.
Instructors: D. Ushizima, G. Wilson, D. Winston
Data science has been called "the sexiest job of the 21st century", a reflection of market demand [link]. If this area is of such importance, why don't we have more courses at LBNL that address the related skills? As a CAMERA co-PI and the UC Berkeley Data Science fellow at LBNL, D. Ushizima has supported initiatives such as Software Carpentry to help bring others up to speed on key data science knowledge.
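To give a flavor of the kind of material such a training covers (this is an illustrative sketch, not the actual course content), two of the tools named in the title, Python and SQL, can be combined in a few lines using Python's built-in sqlite3 module:

```python
# Illustrative example only: a tiny in-memory database queried from Python,
# in the spirit of the "Python and SQL" portion of the training.
import sqlite3

conn = sqlite3.connect(":memory:")      # temporary database, nothing written to disk
cur = conn.cursor()
cur.execute("CREATE TABLE samples (name TEXT, intensity REAL)")
cur.executemany("INSERT INTO samples VALUES (?, ?)",
                [("a", 1.5), ("b", 2.0), ("c", 0.7)])
# Parameterized SQL query, filtered and sorted by the database engine
cur.execute("SELECT name FROM samples WHERE intensity > 1.0 ORDER BY name")
print([row[0] for row in cur.fetchall()])  # ['a', 'b']
conn.close()
```

The same pattern scales from a quick exploratory script to querying real experimental metadata, which is why these tools are taught together.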
"High Performance Software, Workflows, and Visualization for Synchrotrons".
Workshop at the 2014 ALS User Meeting, October 7-8, 2014, Lawrence Berkeley National Laboratory.
Organizers: Alexander Hexemer, Dula Parkinson, David Shapiro, Alastair MacDowell (ALS-LBNL), Craig Tull (CRD-LBNL)
Beamlines require robust, high-performance software for analysis and visualization. This need is driven by high data rates and volumes, and by dynamic experiments in which researchers must have feedback during beamtime. A flexible, highly customizable workflow framework could be used across many beamlines and synchrotrons. Beamline scientists and the user community could leverage this framework, making it significantly easier for them to provide high-performance analysis tools to general users. This workshop will investigate the current state of the art in workflow, analysis, and visualization tools.
"CAMERA-SHARP real-time robust ptychographic imaging".
Workshop at the 2014 ALS User Meeting, Wednesday, October 8, 2014, 9 am-5 pm, Lawrence Berkeley National Laboratory, Berkeley, CA.
Organizers: S. Marchesini, D. Shapiro, H. Krishnan, F. Maia
By combining diffraction with microscopy and the computational power of GPUs, high-throughput "imaging by diffraction" techniques can quickly produce the sharpest images ever recorded. Every scanning microscope can now add a parallel detector, and every diffractive imaging instrument can add a scanning stage to take advantage of this disruptive technique. To sustain high-throughput processing (on the order of a TB/h or more), the CAMERA collaboration has recently released a software suite for real-time ptychographic data analysis, which is freely available* and open for collaborative development.

The workshop is intended to be accessible both to new users who wish to learn about ptychographic imaging analysis and to experienced scientists interested in the available tools for real-time analysis of noisy experimental data. Live demonstrations and tutorials of the software packages will be given where possible to help new users. We encourage participants to bring ptychographic data for demonstrations of the reconstruction process, and also to present demonstrations of their own reconstruction methods. The workshop will consist of a few brief morning talks followed by a long interactive tutorial session.