Stream Grabber Parameters#
This topic describes the parameters related to the stream grabber.
Access Mode#
The AccessMode parameter indicates the mode of access the current application has to the device:
Control: The application has control access to the device. Other applications are still able to monitor the device and can request to take over control or gain exclusive access to the device.
Exclusive: The application has exclusive access to the device. No other application can control or monitor the device.
Monitor: The application has monitoring, i.e., read-only, access to the device.
NotInitialized: Access to the device hasn't been initialized.
This parameter is read-only.
Auto Packet Size#
Use the AutoPacketSize parameter to optimize the size of the data packets transferred via Ethernet.
When the parameter is set to true, the camera automatically negotiates the packet size to find the largest possible packet size.
To retrieve the packet size that was negotiated, read the camera's packet size parameter.
Using large packets reduces the overhead for transferring images. The maximum packet size depends on the network hardware and its configuration.
Maximum Buffer Size#
Use the MaxBufferSize parameter to specify the maximum size (in bytes) of a buffer used for grabbing images.
A grab application must set this parameter before grabbing starts.
Maximum Number of Buffers#
Use the MaxNumBuffer parameter to specify the maximum number of buffers that can be used simultaneously for grabbing images.
Maximum Transfer Size#
Use the MaxTransferSize parameter to specify the maximum USB data transfer size in bytes. The default value is appropriate for most applications. Increase the value to lower the CPU load.
USB host adapter drivers may require decreasing the value if the application fails to receive the image stream. The maximum value depends on the operating system.
Num Max Queued URBs#
Use the NumMaxQueuedUrbs parameter to specify the maximum number of USB request blocks (URBs) to be enqueued simultaneously.
Increasing this value may improve stability and reduce jitter, but requires more resources on the host computer.
Decreasing this value can be helpful if you get error messages related to insufficient system memory, e.g., "Failed to probe and lock buffer=0xe2010130" or "Failed to submit transfer status=0xe2100001".
Receive Thread Priority Override#
Use the ReceiveThreadPriorityOverride parameter to enable assigning a custom priority to the thread that receives incoming stream packets. Only available if the socket driver is used.
To assign the priority, use the ReceiveThreadPriority parameter.
Receive Thread Priority#
Use the ReceiveThreadPriority parameter to set the thread priority of the receive thread. Only available if the socket driver is used.
To assign the priority, the ReceiveThreadPriorityOverride parameter must be set to true.
Socket Buffer Size#
Use the SocketBufferSize parameter to set the socket buffer size in kilobytes. Only available if the socket driver is used.
Status#
The Status parameter indicates the current status of the stream grabber:
Closed: The stream grabber is closed.
Locked: The stream grabber is locked.
NotInitialized: The stream grabber is not initialized.
Open: The stream grabber is open.
This parameter is read-only.
Transfer Loop Thread Priority#
Use the TransferLoopThreadPriority parameter to specify the priority of the threads that handle USB requests from the stream interface.
In pylon, there are two threads belonging to the USB transport layer, one for the image URBs (USB request blocks) and one for the event URBs. The transport layer enqueues the URBs to the xHCI driver and polls the bus for delivered URBs.
You can control the priority of both threads via the TransferLoopThreadPriority parameter.
On Windows, by default, the parameter is set to the following value:
- 25 if the host application is run with administrator privileges.
- 15 or lower if the host application is run without administrator privileges.
On Linux and macOS, the default parameter value and the parameter value range may differ.
The transfer loop thread priority should always be higher than the grab engine thread priority (InternalGrabEngineThreadPriority parameter) and the grab loop thread priority (GrabLoopThreadPriority parameter).
For more information, see the Application Settings for High Performance section in the pylon API Documentation.
Type of GigE Vision Driver#
Use the Type parameter to set the host application's GigE Vision driver type:
WindowsFilterDriver: The host application uses the pylon GigE Vision Filter Driver.
WindowsPerformanceDriver (deprecated in pylon version 7.1): The host application uses the pylon GigE Vision Performance Driver.
SocketDriver: The host application uses the pylon GigE Vision Socket Driver.
NoDriverAvailable: No suitable driver is installed. The driver type can't be set.
For more information about the driver types, see Drivers.
Type: Socket Driver Available#
The TypeIsSocketDriverAvailable parameter indicates whether the pylon GigE Vision Socket Driver is currently available (1) or not available (0).
Type: Windows Filter Driver Available#
The TypeIsWindowsFilterDriverAvailable parameter indicates whether the pylon GigE Vision Filter Driver is currently available (1) or not available (0).
Type: Windows Intel Performance Driver Available#
The TypeIsWindowsIntelPerformanceDriverAvailable parameter indicates whether the pylon GigE Vision Performance Driver is currently available (1) or not available (0).
Packet Resend Mechanism Parameters#
Packet resend mechanisms (GigE Vision only) optimize the network performance by detecting and resending missing data packets.
The pylon GigE Vision Filter and Socket Drivers have advanced and robust packet resend mechanisms.
They allow for fine-tuning and can send multiple consecutive resend requests until a maximum number of requests has been reached.
If the driver detects that packets (e.g., leader, payload, or trailer packets) are missing, it waits for a specified period of time. If the packets don't arrive within that time, the driver sends one or more resend requests to try to retrieve the lost packets.
If a consecutive range of payload packets is missing, the driver will automatically send a single "batch resend request" for the range of missing packets. In addition, the driver may automatically send a resend request for a resend request that has been considered lost.
Enable Resend#
Use the EnableResend parameter to enable or disable the packet resend mechanism for the currently selected type of GigE Vision driver.
Packet Timeout#
Use the PacketTimeout parameter to specify how long (in milliseconds) the filter driver waits for the next expected packet before it initiates a resend request.
Make sure that the parameter is set to a longer time interval than the inter-packet delay.
Frame Retention#
Use the FrameRetention parameter to specify the maximum time in milliseconds to receive all packets of a frame. The timer starts when the first packet has been received. If the transmission is not completed within the time specified, the corresponding frame is delivered with the status "Failed".
Maximum Number of Resend Requests#
Use the MaximumNumberResendRequests parameter to specify the maximum number of resend requests per missing packet.
Firewall Traversal Interval#
Use the FirewallTraversalInterval parameter to prevent a firewall from blocking GigE Vision packets.
The parameter is available for both the stream grabber and the event grabber, i.e., for handling GigE Vision Streaming Protocol (GVSP) packets and Message Channel Source Port (MCSP) packets. It must be configured separately for both types of packets. For more information, see the code sample below.
If enabled, the grabber will send specific packets to simulate a traffic conversation that prevents firewall blocking.
By default, a packet will be sent every 10 seconds for GVSP packets and every 30 seconds for MCSP packets during a given streaming session.
You can set the parameter in milliseconds to optimize it for your firewall.
If the FirewallTraversalInterval parameter is set to zero, the Firewall Traversal feature is considered disabled.
Stream Destination Parameters#
The following parameters (GigE Vision only) allow you to configure where the stream grabber should send the grabbed data to.
The stream grabber can send the stream data to one specific device or to multiple devices in the network.
Transmission Type#
Use the TransmissionType parameter to define how stream data is transferred within the network. You can set the parameter to the following values:
Unicast (default): The stream data is sent to a single device in the local network, usually the camera's GigE network adapter (see destination address). Other devices can't receive the stream data.
LimitedBroadcast: The stream data is sent to all devices in the local network (255.255.255.255), even if they aren't interested in receiving stream data. In large local networks, this uses a large amount of network bandwidth. To use this transmission type, you must set up the controlling and monitoring applications.
SubnetDirectedBroadcasting: The stream data is sent to all devices in the same subnet as the camera, even if they aren't interested in receiving stream data. If the subnet is small, this may save network bandwidth. Because devices outside the subnet can't receive the stream data, this transmission type can be useful, e.g., for security purposes.
For subnet-directed broadcasting, the stream grabber uses a subnet broadcast address. The subnet broadcast address is obtained by performing a bitwise OR between the camera's IP address and the bit complement of the camera's subnet mask (see destination address). To use this transmission type, you must set up the controlling and monitoring applications.
- To set the camera's IP address and subnet mask, use the pylon IP Configurator.
- For more information about IP addresses, subnet masks, and subnet broadcast addresses, visit the Online IP Subnet Calculator website.
Multicast: The stream data is sent to selected devices in the local network. This saves network bandwidth because data is only sent to those devices that are interested in receiving the data. Also, you can specify precisely which devices you want to send the data to.
To use multicast, the stream destination address must be set to a multicast group address (224.0.0.0 to 239.255.255.255). Also, you must set up the controlling and monitoring applications. Then, the pylon API automatically takes care of creating and managing a multicast group that other devices can join.
UseCameraConfig: The stream transmission configuration is read from the camera. Use this option only if you want to set up a monitoring application.
Controlling and Monitoring Applications#
When using limited broadcast, subnet-directed broadcast, or multicast, you usually want to send the image data stream from one camera to multiple destinations.
To achieve this, you must set up exactly one controlling application and one or more monitoring applications.
- The controlling application starts and stops image acquisition. It can also change the camera configuration.
- The monitoring applications receive the stream data. Monitoring applications open the camera in read-only mode. This means that they can't start and stop image acquisition or change the camera configuration.
For testing purposes, you can use one instance of the pylon Viewer as the controlling application and another instance of the pylon Viewer as the monitoring application.
To use different instances of the pylon Viewer as controlling and monitoring applications:
- Start the pylon Viewer and open a GigE device.
- Start another instance of the pylon Viewer. This will act as the monitoring application:
- Windows: Start the pylon Viewer. In the Devices pane of the pylon Viewer, right-click the GigE device opened in step 1 and then click Open Device … > Monitor Mode.
- Linux: At the command line, type:
- macOS: At the command line, type:
./Applications/pylon Viewer.app/Contents/MacOS/pylon Viewer -m
For more information about setting up controlling and monitoring applications, see the GigE Multicast/Broadcast section in the pylon API Documentation.
Destination Address#
The DestinationAddr parameter indicates the IP address to which the stream grabber sends all stream data.
The value and the access mode of the parameter depend on the TransmissionType parameter value:

| TransmissionType Parameter Value | DestinationAddr Parameter Value | DestinationAddr Access Mode |
|---|---|---|
| Unicast | IP address of the camera's GigE network adapter | Read-only |
| SubnetDirectedBroadcasting | (Camera's IP address) OR NOT (camera's subnet mask) | Read-only |
| Multicast | Default: 239.0.0.1; allowed range: 224.0.0.0 to 239.255.255.255 | Read/write |

Some addresses in this range are reserved. If you are unsure, use an address between 239.255.0.0 and 239.255.255.255. This range is assigned by RFC 2365 as a locally administered address space.
Destination Port#
The DestinationPort parameter indicates the port to which the stream grabber sends all stream data.
If the parameter is set to 0, pylon automatically selects an unused port.
For more information, see the Selecting a Destination Port section in the pylon API Documentation.
Statistics Parameters#
The pylon API provides statistics parameters that allow you to check whether your camera is set up correctly, your hardware components are appropriate, and your system performs well.
At camera startup, all statistics parameters are set to 0. While continuously grabbing images, the parameters are continuously updated to provide information about, e.g., lost images or buffers that were grabbed incompletely.
Buffer Underrun Count#
The Statistic_Buffer_Underrun_Count parameter counts the number of frames lost because there were no buffers in the queue.
The parameter value increases whenever an image is received, but there are no queued, free buffers in the driver input queue and therefore the frame is lost.
Failed Buffer Count#
The Statistic_Failed_Buffer_Count parameter counts the number of buffers that returned with status "failed", i.e., buffers that were grabbed incompletely.
The error code for incompletely grabbed buffers is 0xE1000014 on GigE cameras and 0xE2000212 on USB 3.0 cameras.
Failed Packet Count#
The Statistic_Failed_Packet_Count parameter counts packets that were successfully received by the stream grabber, but have been reported as "failed" by the camera.
The most common reason for packets being reported as "failed" is that a packet resend request couldn't be satisfied by the camera. This occurs, e.g., if the requested data has already been overwritten by new image data inside the camera's memory.
The Failed Packet Count does not count packets that are considered lost because all resend requests have failed. In this case, the Failed Buffer Count will be increased, but not the Failed Packet Count.
Last Block ID#
The Statistic_Last_Block_Id parameter indicates the last grabbed block ID.
Last Failed Buffer Status#
The Statistic_Last_Failed_Buffer_Status parameter indicates the status code of the last failed buffer.
Last Failed Buffer Status Text#
The Statistic_Last_Failed_Buffer_Status_Text parameter indicates the last error status of a read or write operation.
Missed Frame Count#
The Statistic_Missed_Frame_Count parameter counts the number of frames that were acquired but skipped because the camera's internal frame buffer was already full.
Many Basler cameras are equipped with a frame buffer that is able to store several complete frames. A high Missed Frame Count indicates that the host controller doesn't support the bandwidth of the camera, i.e., the host controller does not retrieve the acquired images in time. This causes the camera to buffer images in its internal frame buffer. When the internal frame buffer is full, the camera will start skipping newly acquired sensor data.
Resend Packet Count#
The Statistic_Resend_Packet_Count parameter counts the number of packets requested by resend requests.
If you are using the Filter Driver and the driver hasn't received the "leader" of a frame, i.e., the packet indicating the beginning of a frame, it will disregard the complete frame. No resend requests will be sent and no statistics parameters will be increased. This means that if the "leader" packet is lost, the complete frame will be lost without notice. Basler recommends checking the Frame Counter chunk to detect lost frames.
Resend Request Count#
The Statistic_Resend_Request_Count parameter counts the number of packet resend requests sent.
Depending on the driver type and the stream grabber settings, the stream grabber may send multiple requests for one missing packet, or it may send one request for multiple packets. Therefore, the Resend Request Count and the Resend Packet Count will most likely be different.
Resynchronization Count#
The Statistic_Resynchronization_Count parameter counts the number of stream resynchronizations.
If the host gets out of sync within the streaming process, it initiates a resynchronization, and the camera's internal buffer is flushed.
A host may get out of sync if it requests stream packets with a specific sequence of IDs, but the device delivers packets with a different sequence. This may occur when the connection between the camera and the host is faulty. A host being out of sync results in massive image loss.
A host resynchronization is considered the most serious error case in the USB 3.0 and USB3 Vision specification.
Total Buffer Count#
On GigE cameras, the Statistic_Total_Buffer_Count parameter counts the number of buffers that returned with "success" or "failed" status, i.e., all successfully or incompletely grabbed buffers. On other cameras, e.g., USB cameras, the parameter counts the number of buffers processed.
The error code for incompletely grabbed buffers is 0xE1000014 on GigE cameras and 0xE2000212 on USB 3.0 cameras.
Total Packet Count#
The Statistic_Total_Packet_Count parameter counts all packets received, including packets that have been reported as "failed", i.e., including the Failed Packet Count.
```cpp
// ** General Parameters **
// Access Mode
AccessModeEnums accessMode = camera.GetStreamGrabberParams().AccessMode.GetValue();
// Auto Packet Size
camera.GetStreamGrabberParams().AutoPacketSize.SetValue(true);
// Maximum Buffer Size
camera.GetStreamGrabberParams().MaxBufferSize.SetValue(131072);
// Maximum Number of Buffers
camera.GetStreamGrabberParams().MaxNumBuffer.SetValue(16);
// Maximum Transfer Size
camera.GetStreamGrabberParams().MaxTransferSize.SetValue(1048568);
// Num Max Queued Urbs
camera.GetStreamGrabberParams().NumMaxQueuedUrbs.SetValue(64);
// Receive Thread Priority Override
camera.GetStreamGrabberParams().ReceiveThreadPriorityOverride.SetValue(true);
// Receive Thread Priority
camera.GetStreamGrabberParams().ReceiveThreadPriority.SetValue(15);
// Socket Buffer Size (socket driver only)
camera.GetStreamGrabberParams().SocketBufferSize.SetValue(2048);
// Status
StatusEnums streamGrabberStatus = camera.GetStreamGrabberParams().Status.GetValue();
// Transfer Loop Thread Priority
camera.GetStreamGrabberParams().TransferLoopThreadPriority.SetValue(15);
// Type of GigE Vision Driver
camera.GetStreamGrabberParams().Type.SetValue(Type_WindowsIntelPerformanceDriver);
// Type: Socket Driver Available
int64_t socketDriverAvailable = camera.GetStreamGrabberParams().TypeIsSocketDriverAvailable.GetValue();
// Type: Windows Filter Driver Available
int64_t filterDriverAvailable = camera.GetStreamGrabberParams().TypeIsWindowsFilterDriverAvailable.GetValue();
// Type: Windows Intel Performance Driver Available
int64_t performanceDriverAvailable = camera.GetStreamGrabberParams().TypeIsWindowsIntelPerformanceDriverAvailable.GetValue();
// ** Packet Resend Mechanism Parameters **
// Enable Resend
camera.GetStreamGrabberParams().EnableResend.SetValue(true);
// Packet Timeout
camera.GetStreamGrabberParams().PacketTimeout.SetValue(40);
// Frame Retention
camera.GetStreamGrabberParams().FrameRetention.SetValue(200);
// Maximum Number of Resend Requests
camera.GetStreamGrabberParams().MaximumNumberResendRequests.SetValue(25);
// Firewall Traversal Interval
camera.GetStreamGrabberParams().FirewallTraversalInterval.SetValue(10000);
camera.GetEventGrabberParams().FirewallTraversalInterval.SetValue(30000);
// ** Stream Destination Parameters **
// Transmission Type
camera.GetStreamGrabberParams().TransmissionType.SetValue(TransmissionType_Unicast);
// Destination Address
GenICam::gcstring destinationAddr = camera.GetStreamGrabberParams().DestinationAddr.GetValue();
// Destination Port
camera.GetStreamGrabberParams().DestinationPort.SetValue(0);
// ** Statistics Parameters **
// Buffer Underrun Count
int64_t bufferUnderrunCount = camera.GetStreamGrabberParams().Statistic_Buffer_Underrun_Count.GetValue();
// Failed Buffer Count
int64_t failedBufferCount = camera.GetStreamGrabberParams().Statistic_Failed_Buffer_Count.GetValue();
// Failed Packet Count
int64_t failedPacketCount = camera.GetStreamGrabberParams().Statistic_Failed_Packet_Count.GetValue();
// Last Block ID
int64_t lastBlockId = camera.GetStreamGrabberParams().Statistic_Last_Block_Id.GetValue();
// Last Failed Buffer Status
int64_t lastFailedBufferStatus = camera.GetStreamGrabberParams().Statistic_Last_Failed_Buffer_Status.GetValue();
// Last Failed Buffer Status Text
GenICam::gcstring lastFailedBufferStatusText = camera.GetStreamGrabberParams().Statistic_Last_Failed_Buffer_Status_Text.GetValue();
// Missed Frame Count
int64_t missedFrameCount = camera.GetStreamGrabberParams().Statistic_Missed_Frame_Count.GetValue();
// Resend Request Count
int64_t resendRequestCount = camera.GetStreamGrabberParams().Statistic_Resend_Request_Count.GetValue();
// Resend Packet Count
int64_t resendPacketCount = camera.GetStreamGrabberParams().Statistic_Resend_Packet_Count.GetValue();
// Resynchronization Count
int64_t resynchronizationCount = camera.GetStreamGrabberParams().Statistic_Resynchronization_Count.GetValue();
// Total Buffer Count
int64_t totalBufferCount = camera.GetStreamGrabberParams().Statistic_Total_Buffer_Count.GetValue();
// Total Packet Count
int64_t totalPacketCount = camera.GetStreamGrabberParams().Statistic_Total_Packet_Count.GetValue();
```
```csharp
// ** General Parameters **
// Access Mode
string accessMode = camera.Parameters[PLStream.AccessMode].GetValue();
// Auto Packet Size
camera.Parameters[PLStream.AutoPacketSize].SetValue(true);
// Maximum Buffer Size
camera.Parameters[PLStream.MaxBufferSize].SetValue(131072);
// Maximum Number of Buffers
camera.Parameters[PLStream.MaxNumBuffer].SetValue(16);
// Maximum Transfer Size
camera.Parameters[PLStream.MaxTransferSize].SetValue(1048568);
// Num Max Queued Urbs
camera.Parameters[PLStream.NumMaxQueuedUrbs].SetValue(64);
// Receive Thread Priority Override
camera.Parameters[PLStream.ReceiveThreadPriorityOverride].SetValue(true);
// Receive Thread Priority
camera.Parameters[PLStream.ReceiveThreadPriority].SetValue(15);
// Socket Buffer Size (socket driver only)
camera.Parameters[PLStream.SocketBufferSize].SetValue(2048);
// Status
string streamGrabberStatus = camera.Parameters[PLStream.Status].GetValue();
// Transfer Loop Thread Priority
camera.Parameters[PLStream.TransferLoopThreadPriority].SetValue(15);
// Type of GigE Vision Driver
camera.Parameters[PLStream.Type].SetValue(PLStream.Type.WindowsIntelPerformanceDriver);
// Type: Socket Driver Available
Int64 socketDriverAvailable = camera.Parameters[PLStream.TypeIsSocketDriverAvailable].GetValue();
// Type: Windows Filter Driver Available
Int64 filterDriverAvailable = camera.Parameters[PLStream.TypeIsWindowsFilterDriverAvailable].GetValue();
// Type: Windows Intel Performance Driver Available
Int64 performanceDriverAvailable = camera.Parameters[PLStream.TypeIsWindowsIntelPerformanceDriverAvailable].GetValue();
// ** Packet Resend Mechanism Parameters **
// Enable Resend
camera.Parameters[PLStream.EnableResend].SetValue(true);
// Packet Timeout
camera.Parameters[PLStream.PacketTimeout].SetValue(40);
// Frame Retention
camera.Parameters[PLStream.FrameRetention].SetValue(200);
// Maximum Number of Resend Requests
camera.Parameters[PLStream.MaximumNumberResendRequests].SetValue(25);
// Firewall Traversal Interval
camera.Parameters[PLStream.FirewallTraversalInterval].SetValue(10000);
camera.Parameters[PLEventGrabber.FirewallTraversalInterval].SetValue(30000);
// ** Stream Destination Parameters **
// Transmission Type
camera.Parameters[PLStream.TransmissionType].SetValue(PLStream.TransmissionType.Unicast);
// Destination Address
string destinationAddr = camera.Parameters[PLStream.DestinationAddr].GetValue();
// Destination Port
camera.Parameters[PLStream.DestinationPort].SetValue(0);
// ** Statistics Parameters **
// Buffer Underrun Count
Int64 bufferUnderrunCount = camera.Parameters[PLStream.Statistic_Buffer_Underrun_Count].GetValue();
// Failed Buffer Count
Int64 failedBufferCount = camera.Parameters[PLStream.Statistic_Failed_Buffer_Count].GetValue();
// Failed Packet Count
Int64 failedPacketCount = camera.Parameters[PLStream.Statistic_Failed_Packet_Count].GetValue();
// Last Block ID
Int64 lastBlockId = camera.Parameters[PLStream.Statistic_Last_Block_Id].GetValue();
// Last Failed Buffer Status
Int64 lastFailedBufferStatus = camera.Parameters[PLStream.Statistic_Last_Failed_Buffer_Status].GetValue();
// Last Failed Buffer Status Text
string lastFailedBufferStatusText = camera.Parameters[PLStream.Statistic_Last_Failed_Buffer_Status_Text].GetValue();
// Missed Frame Count
Int64 missedFrameCount = camera.Parameters[PLStream.Statistic_Missed_Frame_Count].GetValue();
// Resend Packet Count
Int64 resendPacketCount = camera.Parameters[PLStream.Statistic_Resend_Packet_Count].GetValue();
// Resend Request Count
Int64 resendRequestCount = camera.Parameters[PLStream.Statistic_Resend_Request_Count].GetValue();
// Resynchronization Count
Int64 resynchronizationCount = camera.Parameters[PLStream.Statistic_Resynchronization_Count].GetValue();
// Total Buffer Count
Int64 totalBufferCount = camera.Parameters[PLStream.Statistic_Total_Buffer_Count].GetValue();
// Total Packet Count
Int64 totalPacketCount = camera.Parameters[PLStream.Statistic_Total_Packet_Count].GetValue();
```
This sample code is available in C++ and C# only.
You can also use the pylon Viewer to easily set the parameters.