C++ Programmer's Guide#
If you are new to pylon, Basler recommends making yourself familiar with the pylon C++ API first by reading the Getting Started section of the pylon C++ Programmer's Guide.
C++ Programming Samples#
On Windows: To open the folder containing the programming samples for Stereo mini cameras, press the Win key, go to the Basler folder, and choose Stereo mini Samples. A File Explorer window opens. The C++ programming samples are located in the cpp folder.
On Linux, the C++ samples are located in the /opt/pylon/share/pylon/Samples/stereo-mini/cpp folder.
The following Stereo mini C++ samples are available:
- FirstSample: Grabs intensity and range data and prints center pixel values.
  - Opens the first Stereo mini device using BaslerGenTlStmDeviceClass.
  - Enables intensity and range components.
  - Reads center pixel intensity (RGBA) and range values.
- GrabDepthMap: Converts range data to a colorized depth map using OpenCV.
  - Uses the range component (Coord3D_C16).
  - Colorizes depth values with OpenCV for live visualization.
- ShowPointCloud: Displays live colored point clouds with PCL.
  - Uses the range component (Coord3D_ABC32f) and intensity for color.
  - Converts grab results to pcl::PointCloud<pcl::PointXYZRGB>.
  - Displays a live point cloud viewer.
- SavePointCloud: Saves one colored point cloud as a .ply file.
  - Captures one frame and exports a colored point cloud to pointcloud_color.ply.
Prerequisites for Building the Samples#
On Windows:
- Visual Studio 2019 or above
- pylon Software Suite with Stereo mini supplementary package
- Optional: OpenCV library
- Optional: Point Cloud Library (PCL)
- Optional: Boost libraries

On Linux:
- pylon Software Suite with Stereo mini supplementary package
- CMake 3.3 or above
- A C++ compiler (for example, gcc/g++)
- Optional: OpenCV
- Optional: Point Cloud Library (PCL)
- Optional: Boost libraries
On an Ubuntu system, you can install the prerequisites by issuing the following command:
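For example, a command along these lines installs the build tools and the optional libraries (the package names below are typical Ubuntu package names and are an assumption; adjust them to your distribution and release):

```shell
sudo apt-get install cmake build-essential libopencv-dev libpcl-dev libboost-all-dev
```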
How to Build the Samples#
Info
Before building the samples, copy the sample folder to a location where you have read and write access.
CMake is used as the build system for all C++ samples.
Using Visual Studio (Open Folder):
- Start Visual Studio.
- On the start page, choose Open Folder (or Open a local folder) and navigate to the copied sample folder.
- Open the cpp subfolder and click the Select Folder button.
- Choose your preferred configuration from the configuration dropdown.
- Press Ctrl+Shift+B to build all samples.

Using CMake GUI:
- Start CMake GUI.
- In the Where is the source code field, enter the path to the cpp subfolder of your samples.
- In the Where to build the binaries field, enter a path where you want to store the build output.
- Click Configure.
- Choose your Visual Studio version from the dropdown and click Finish.
- Click Generate.
- In File Explorer, navigate to your build folder and double-click the generated Visual Studio solution file (.sln) to open it.
- In Visual Studio, press Ctrl+Shift+B to build all samples.

On Linux (command line):
- Create a build directory: mkdir build && cd build
- Run CMake: cmake ..
- Build the samples: make
- Optionally, install the samples: sudo make install
Installing the OpenCV Library#
Download OpenCV from https://github.com/opencv/opencv/releases.
Make sure that you use an OpenCV version that matches your Visual Studio toolchain. If no matching prebuilt package is available, build OpenCV from source with your Visual Studio version.
If CMake doesn't find OpenCV automatically, set OpenCV_DIR to the directory that contains OpenCVConfig.cmake.
Example (CMake command line):
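A sketch of such an invocation (the OpenCV path below is an assumption; point it at the folder of your own installation that contains OpenCVConfig.cmake):

```shell
cmake -DOpenCV_DIR="C:/opencv/build" ..
```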
Adding OpenCV to PATH Environment Variable#
Add the OpenCV bin directory to PATH, for example:
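A sketch for the current command prompt session (the installation path is an assumption; to make the change permanent, edit the PATH variable in the Windows system environment variable settings instead):

```shell
set PATH=%PATH%;C:\opencv\build\x64\vc16\bin
```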
Installing the Point Cloud Library#
Download PCL from https://github.com/PointCloudLibrary/pcl/releases.
Make sure that the PCL version matches your Visual Studio toolchain.
If CMake doesn't find PCL automatically, set PCL_DIR to the directory that contains PCLConfig.cmake.
Example (CMake command line):
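A sketch of such an invocation (the PCL path below is an assumption; point it at the folder of your own installation that contains PCLConfig.cmake):

```shell
cmake -DPCL_DIR="C:/Program Files/PCL 1.13.1/cmake" ..
```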
If required by your installation method, add the PCL bin directory to PATH.
If CMake doesn't find PCL, install the PCL development package or build PCL from source.
On Ubuntu systems, install the package with:
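On current Ubuntu releases, the PCL development package is named libpcl-dev:

```shell
sudo apt-get install libpcl-dev
```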
If PCL is installed in a non-standard location, set PCL_DIR accordingly:
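A sketch of such an invocation (the installation prefix below is an assumption; use the folder of your own installation that contains PCLConfig.cmake):

```shell
cmake -DPCL_DIR=/opt/pcl/share/pcl-1.13 ..
```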
Info
Samples using PCL can crash, depending on the installed PCL and VTK versions, because of a known issue in the VTK version used by PCL's visualization module. Source: https://github.com/PointCloudLibrary/pcl/issues/5237. Use PCL 1.13.1 or newer when possible.
Running the Samples#
Run an executable from the build output folder, for example:
./FirstSample/FirstSample
./GrabDepthMap/GrabDepthMap
./ShowPointCloud/ShowPointCloud
./SavePointCloud/SavePointCloud
How to Build Applications#
Use CMake to create your own Stereo mini C++ applications, as demonstrated by the sample set provided.
Using IDE Projects#
For Visual Studio-based projects, refer to: Common Settings for Building Applications with pylon (Windows)
For Linux-based builds, refer to: Common Settings for Building Applications with pylon (Linux)
Using CMake#
The Stereo mini samples use a CMake setup with:
- find_package(pylon 10.0 REQUIRED)
- Optional: find_package(OpenCV QUIET) for depth map visualization
- Optional: find_package(PCL QUIET) for point cloud visualization
A minimal CMake project for a Stereo mini application can look like this:
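A sketch of such a CMakeLists.txt (the target name stereo_mini_app and the source file name main.cpp are placeholders; the pylon::pylon imported target is provided by the find_package(pylon ...) setup that the samples use):

```cmake
cmake_minimum_required(VERSION 3.3)
project(stereo_mini_app)

# Locate the pylon SDK; this provides the pylon::pylon imported target.
find_package(pylon 10.0 REQUIRED)

add_executable(stereo_mini_app main.cpp)
target_link_libraries(stereo_mini_app PRIVATE pylon::pylon)
```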
If your application uses the same optional dependencies as the samples, add them conditionally:
find_package(OpenCV QUIET)
find_package(PCL QUIET)
if (OpenCV_FOUND)
target_include_directories(stereo_mini_app PRIVATE ${OpenCV_INCLUDE_DIRS})
target_link_libraries(stereo_mini_app PRIVATE ${OpenCV_LIBS})
endif()
if (PCL_FOUND)
target_include_directories(stereo_mini_app PRIVATE ${PCL_INCLUDE_DIRS})
target_compile_definitions(stereo_mini_app PRIVATE ${PCL_DEFINITIONS})
target_link_libraries(stereo_mini_app PRIVATE ${PCL_LIBRARIES})
endif()
To build from the command line:
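For example, from the folder containing the CMakeLists.txt (an out-of-source build; cmake --build works with any generator):

```shell
mkdir build && cd build
cmake ..
cmake --build .
```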
Accessing Camera Features#
Opening a Device#
Use the Pylon::CTlFactory::CreateFirstDevice() method to create a camera object for the first available Stereo mini camera. Always specify the pylon device class identifier for Stereo mini cameras to prevent CreateFirstDevice() from creating a camera object for a different type of camera:
// Open first available Stereo mini camera
Pylon::CStereoMiniInstantCamera camera(
Pylon::CTlFactory::GetInstance().CreateFirstDevice(
Pylon::CDeviceInfo().SetDeviceClass(Pylon::BaslerGenTlStmDeviceClass)));
std::cout << "Using device: " << camera.GetDeviceInfo().GetFriendlyName() << std::endl;
camera.Open();
Acquiring Data#
Refer to the Grabbing Images section of the pylon C++ Programmer's Guide to get familiar with how to acquire data using the pylon API.
pylon represents grabbed images as data structures called GrabResults. GrabResults acquired by a Stereo mini camera contain multiple components. By default, each GrabResult stores an intensity image, a depth map, and additional information.
// Enable intensity image by enabling the Intensity component
camera.ComponentSelector.FromString("Intensity");
camera.ComponentEnable.SetValue(true);
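The Range component can be enabled the same way. A sketch following the same selector pattern as above:

```cpp
// Enable depth data by enabling the Range component
camera.ComponentSelector.FromString("Range");
camera.ComponentEnable.SetValue(true);
```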
Grabbing Images#
pylon supports different approaches for setting up a grab loop and provides different strategies for handling memory buffers.
Refer to the Grabbing Images section of the pylon C++ Programmer's Guide for more details.
The following code snippet illustrates a typical grab loop using the CStereoMiniInstantCamera class:
camera.StartGrabbing();
Pylon::CGrabResultPtr ptrGrabResult;
camera.RetrieveResult(5000, ptrGrabResult, Pylon::TimeoutHandling_ThrowException);
if (ptrGrabResult->GrabSucceeded())
{
    // Access the components via the grab result's data container
    const auto container = ptrGrabResult->GetDataContainer();
    for (size_t i = 0; i < container.GetDataComponentCount(); ++i)
    {
        const auto component = container.GetDataComponent(i);
        // component.GetComponentType() identifies intensity vs. range data
        // Process component
    }
}
Accessing Components#
To access the individual components of a GrabResult, you can use the Pylon::CPylonDataContainer and Pylon::CPylonDataComponent classes. A container can hold one or more components. You can use the container to query for the number of components and to retrieve a specific component. Each component in the container holds the actual data, e.g., the depth values, as well as its metadata.
Use the Pylon::CGrabResultData::GetDataContainer() method to get access to a GrabResult's CPylonDataContainer. Use the Pylon::CPylonDataContainer::GetDataComponent() method to access a component by specifying the index of the component.
Refer to the Multi-Component Grab Results section in the Advanced Topics chapter of the pylon C++ Programmer's Guide for more information about how pylon provides access to GrabResults containing multiple components.
Accessing Depth Data#
Stereo mini cameras provide depth data through the Range component. Depending on the selected pixel format, range data is available as:
- Coord3D_C16 for 16-bit depth values
- Coord3D_ABC32f for direct 3D coordinates (X, Y, Z floats)
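As a sketch, the raw 16-bit depth value at the image center can be read from the range component like this (the component index 1 is an assumption that depends on which components are enabled; converting the raw value to metric units requires the camera's Scan3dCoordinateScale, which is not shown here):

```cpp
const auto range = ptrGrabResult->GetDataContainer().GetDataComponent(1); // index is an assumption
const auto* depth = static_cast<const uint16_t*>(range.GetData());
const size_t center = (range.GetHeight() / 2) * range.GetWidth() + range.GetWidth() / 2;
// Multiply depth[center] by the camera's Scan3dCoordinateScale to obtain metric depth.
std::cout << "Center depth (raw): " << depth[center] << std::endl;
```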
Using the pylon API#
For more detailed information about using the pylon C++ API with Stereo mini cameras, refer to the pylon C++ Programmer's Guide.