Stream Grabber Parameters#
This topic describes the parameters related to the stream grabber.
The AccessMode parameter indicates the mode of access the current application has to the device:
Control: The application has control access to the device. Other applications are still able to monitor the device and can request to take over control or gain exclusive access to the device.
Exclusive: The application has exclusive access to the device. No other application can control or monitor the device.
Monitor: The application has monitoring, i.e., read-only, access to the device.
NotInitialized: Access to the device hasn't been initialized.
This parameter is read-only.
Auto Packet Size#
Use the AutoPacketSize parameter to optimize the size of the data packets transferred via Ethernet.
When the parameter is set to true, the camera automatically negotiates the packet size to find the largest possible packet size.
To retrieve the current packet size, get the value of the camera's packet size parameter.
Using large packets reduces the overhead for transferring images. The maximum packet size depends on the network hardware and its configuration.
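To illustrate why larger packets reduce overhead, the following sketch estimates the per-frame packet count and total header bytes for a given payload size. The 36-byte header figure is an assumption for illustration only (roughly IP + UDP + stream protocol headers), not a value from the pylon documentation:

```cpp
#include <cstdint>

// Approximate per-packet header size in bytes (illustrative assumption).
constexpr std::int64_t kHeaderBytes = 36;

// Number of packets needed to transfer one frame, rounding up.
std::int64_t packetsPerFrame(std::int64_t frameBytes, std::int64_t payloadBytes)
{
    return (frameBytes + payloadBytes - 1) / payloadBytes;
}

// Total header overhead for one frame at the given payload size.
std::int64_t headerOverheadBytes(std::int64_t frameBytes, std::int64_t payloadBytes)
{
    return packetsPerFrame(frameBytes, payloadBytes) * kHeaderBytes;
}
```

For a 2,073,600-byte frame, a 1500-byte payload needs 1383 packets, while an 8192-byte (jumbo frame) payload needs only 254, cutting the header overhead by a factor of roughly five.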
Maximum Buffer Size#
Use the MaxBufferSize parameter to specify the maximum size (in bytes) of a buffer used for grabbing images.
A grab application must set this parameter before grabbing starts.
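As a rule of thumb, the buffer must be at least as large as the camera's payload; for uncompressed pixel formats, that is width × height × bytes per pixel. A minimal sketch with illustrative values:

```cpp
#include <cstdint>

// Minimum buffer size for one uncompressed frame.
// Values passed in are illustrative, not defaults from the pylon API.
std::int64_t requiredBufferSize(std::int64_t width, std::int64_t height,
                                std::int64_t bytesPerPixel)
{
    return width * height * bytesPerPixel;
}
```

For example, a 1920 × 1080 Mono8 image requires at least 2,073,600 bytes, so MaxBufferSize must be set to at least that value before grabbing starts.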
Maximum Number of Buffers#
Use the MaxNumBuffer parameter to specify the maximum number of buffers that can be used simultaneously for grabbing images.
Maximum Transfer Size#
Use the MaxTransferSize parameter to specify the maximum USB data transfer size in bytes. The default value is appropriate for most applications. Increase the value to lower the CPU load.
USB host adapter drivers may require decreasing the value if the application fails to receive the image stream. The maximum value depends on the operating system.
Num Max Queued URBs#
Use the NumMaxQueuedUrbs parameter to specify the maximum number of USB request blocks (URBs) to be enqueued simultaneously.
Increasing this value may improve stability and reduce jitter, but requires more resources on the host computer.
Decreasing this value can be helpful if you get error messages related to insufficient system memory, e.g., "Failed to probe and lock buffer=0xe2010130" or "Failed to submit transfer status=0xe2100001".
Receive Thread Priority Override#
Use the ReceiveThreadPriorityOverride parameter to enable assigning a custom priority to the thread that receives incoming stream packets. This parameter is only available if the socket driver is used.
To assign the priority, use the ReceiveThreadPriority parameter.
Receive Thread Priority#
Use the ReceiveThreadPriority parameter to set the thread priority of the receive thread. This parameter is only available if the socket driver is used.
To assign the priority, the ReceiveThreadPriorityOverride parameter must be set to true.
Socket Buffer Size#
Use the SocketBufferSize parameter to set the socket buffer size in kilobytes. This parameter is only available if the socket driver is used.
Status#
The Status parameter indicates the current status of the stream grabber:
Closed: The stream grabber is closed.
Locked: The stream grabber is locked.
NotInitialized: The stream grabber is not initialized.
Open: The stream grabber is open.
This parameter is read-only.
Transfer Loop Thread Priority#
Use the TransferLoopThreadPriority parameter to specify the priority of the threads that handle USB requests from the stream interface.
In pylon, there are two threads belonging to the USB transport layer, one for the image URBs (USB request blocks) and one for the event URBs. The transport layer enqueues the URBs to the xHCI driver and polls the bus for delivered URBs.
You can control the priority of both threads via the TransferLoopThreadPriority parameter.
On Windows, by default, the parameter is set to the following value:
- 25 if the host application is run with administrator privileges.
- 15 or lower if the host application is run without administrator privileges.
On Linux and macOS, the default parameter value and the parameter value range may differ.
The transfer loop thread priority should always be higher than the grab engine thread priority (InternalGrabEngineThreadPriority parameter) and the grab loop thread priority (GrabLoopThreadPriority parameter).
For more information, see the C++ Programmer's Guide and Reference Documentation delivered with the Basler pylon Camera Software Suite ("Advanced Topics" -> "Application Settings for High Performance").
Type of GigE Vision Driver#
Use the Type parameter to set the host application's GigE Vision driver type:
WindowsFilterDriver: The host application uses the pylon GigE Vision Filter Driver. This is a basic GigE Vision network driver that is compatible with all network adapters. The advantage of the filter driver is its extensive compatibility. This driver is available for Windows only.
WindowsPerformanceDriver: The host application uses the pylon GigE Vision Performance Driver. This is a hardware-specific GigE Vision network driver. The performance driver is only compatible with network adapters that use compatible chipsets. The advantage of the performance driver is that it significantly lowers the CPU load needed to service the network traffic between the computer and the camera(s). It also has a more robust packet resend mechanism. This driver is available for Windows only.
SocketDriver: The host application uses the socket driver. This is not a real driver. Instead, it uses the socket API of the respective operating system, e.g., Windows, Linux, or macOS, to communicate with cameras. The advantage of the socket driver is that it does not need any installation and is compatible with all network adapters. When using the socket driver, Basler recommends adjusting the network adapter settings (e.g., optimize the use of jumbo frames, receive descriptors, and interrupt moderation rate) as described in the Network Configuration topic.
NoDriverAvailable: No suitable driver is installed. The driver type can't be set.
Type: Socket Driver Available#
The TypeIsSocketDriverAvailable parameter indicates whether the socket driver is currently available (1) or not available (0).
Type: Windows Filter Driver Available#
The TypeIsWindowsFilterDriverAvailable parameter indicates whether the pylon GigE Vision Filter Driver is currently available (1) or not available (0).
Type: Windows Intel Performance Driver Available#
The TypeIsWindowsIntelPerformanceDriverAvailable parameter indicates whether the pylon GigE Vision Performance Driver is currently available (1) or not available (0).
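The three availability flags can be combined to choose a driver type programmatically. The helper below is hypothetical (it is not part of the pylon API); it simply prefers the performance driver, then the filter driver, then the socket driver:

```cpp
#include <string>

// Hypothetical helper: maps the three availability flags to a driver type
// name, in order of decreasing performance.
std::string pickDriverType(bool performanceAvailable,
                           bool filterAvailable,
                           bool socketAvailable)
{
    if (performanceAvailable) return "WindowsIntelPerformanceDriver";
    if (filterAvailable)      return "WindowsFilterDriver";
    if (socketAvailable)      return "SocketDriver";
    return "NoDriverAvailable";
}
```

In a real application, you would read the flags from the TypeIs…Available parameters and then set the Type parameter accordingly.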
Packet Resend Mechanism Parameters#
The packet resend mechanism (GigE Vision only) optimizes the network performance by detecting and resending missing data packets.
In GigE Vision data transmission, each packet has a header consisting of an ascending 24-bit packet ID. This allows the receiving end to detect if a packet is missing.
You have to weigh the disadvantages and advantages for your special application to decide whether to enable or disable the mechanism:
- If enabled, the packet resend mechanism can cause delays because the driver waits for missing packets.
- If disabled, packets can get lost which results in image data loss.
The pylon GigE Vision Filter Driver and the Performance Driver use different packet resend mechanisms.
Use the EnableResend parameter to enable the packet resend mechanism.
- If the parameter is set to true and the Type parameter is set to WindowsFilterDriver, the packet resend mechanism of the Filter Driver is enabled.
- If the parameter is set to true and the Type parameter is set to WindowsPerformanceDriver, the packet resend mechanism of the Performance Driver is enabled.
- If the parameter is set to false, the packet resend mechanism is disabled.
Packet Resend Mechanism (Filter Driver)#
The pylon GigE Vision Filter Driver has a simple packet resend mechanism.
If the driver detects that packets are missing, it waits for a specified period of time. If the packets don't arrive within the time specified, the driver sends one resend request.
Use the PacketTimeout parameter to specify how long (in milliseconds) the filter driver waits for the next expected packet before it initiates a resend request.
Make sure that the parameter is set to a longer time interval than the inter-packet delay.
Use the FrameRetention parameter to specify the maximum time (in milliseconds) allowed to receive all packets of a frame. The timer starts when the first packet has been received. If the transmission is not completed within the time specified, the corresponding frame is delivered with the status "Failed".
Packet Resend Mechanism (Performance Driver)#
The pylon GigE Vision Performance Driver has a more advanced packet resend mechanism.
It allows more fine-tuning. Also, the driver can send consecutive resend requests until a maximum number of requests has been reached.
Receive Window Size#
Use the ReceiveWindowSize parameter to specify the size (in frames) of the "receive window" in which the stream grabber looks for missing packets.
Example: Assume the receive window size is set to 15. This means that the stream grabber looks for missing packets within the last 15 acquired frames.
The maximum value of the ReceiveWindowSize parameter is 16. If the parameter is set to 0, the packet resend mechanism is disabled.
Resend Request Threshold#
Use the ResendRequestThreshold parameter to set the threshold after which resend requests are initiated.
The parameter value is set in percent of the receive window size.
Example: Assume the receive window size is set to 15, and the resend request threshold is set to 33 %. This means that the threshold lies after 15 × 0.33 ≈ 5 frames.
In the example above, frames 99 and 100 are already within the receive window. The stream grabber detects missing packets in these frames. However, the stream grabber does not yet send a resend request.
Rather, the grabber waits until frame 99 has passed the threshold.
Now, the grabber sends resend requests for missing packets in frames 99 and 100.
Resend Request Batching#
Use the ResendRequestBatching parameter to specify the number of resend requests to be batched, i.e., sent together.
Example: Assume the receive window size is set to 15, the resend request threshold is set to 33 %, and the resend request batching is set to 80 %. This means that the batching window spans 15 × 0.33 × 0.8 ≈ 4 frames.
In the example above, frame 99 has just passed the resend request threshold. The stream grabber looks for missing packets in the frames between the two thresholds and groups them.
Now, the stream grabber sends a single resend request for all missing packets in frames 99, 100, 101, and 102.
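The threshold and batching arithmetic from the two examples above can be sketched in plain C++. The rounding to whole frames is an assumption for illustration; the pylon documentation does not specify the exact rounding behavior:

```cpp
#include <cmath>

// Frames after which resend requests are initiated, as a percentage of the
// receive window size (rounded to the nearest frame; assumption).
int thresholdFrames(int receiveWindowSize, int thresholdPercent)
{
    return static_cast<int>(
        std::lround(receiveWindowSize * thresholdPercent / 100.0));
}

// Frames covered by one batched resend request, as a percentage of the
// threshold distance (rounded to the nearest frame; assumption).
int batchingFrames(int receiveWindowSize, int thresholdPercent,
                   int batchingPercent)
{
    return static_cast<int>(std::lround(
        receiveWindowSize * (thresholdPercent / 100.0)
                          * (batchingPercent / 100.0)));
}
```

With a window of 15 frames, a 33 % threshold, and 80 % batching, this yields the 5-frame threshold and 4-frame batching window used in the examples.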
Maximum Number of Resend Requests#
Use the MaximumNumberResendRequests parameter to specify the maximum number of resend requests per missing packet.
Resend Timeout#
Use the ResendTimeout parameter to specify how long (in milliseconds) the stream grabber waits between detecting a missing packet and sending a resend request.
Resend Request Response Timeout#
Use the ResendRequestResponseTimeout parameter to specify how long (in milliseconds) the stream grabber waits between sending a resend request and considering the request as lost.
If a request is considered lost and the maximum number of resend requests hasn't been reached yet, the grabber sends another request.
If a request is considered lost and the maximum number of resend requests has been reached, the packet is considered lost.
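Under a simplified model (an assumption for illustration, not a formula from the pylon documentation), the worst-case time from detecting a missing packet to declaring it lost is the resend timeout plus one response timeout per allowed request:

```cpp
// Simplified worst-case estimate: the grabber waits resendTimeoutMs before
// the first request, then up to maxRequests response timeouts before the
// packet is considered lost. Retransmission delays are ignored.
int worstCaseLossDetectionMs(int resendTimeoutMs,
                             int responseTimeoutMs,
                             int maxRequests)
{
    return resendTimeoutMs + maxRequests * responseTimeoutMs;
}
```

For example, with a 2 ms resend timeout, a 2 ms response timeout, and 25 maximum requests, a packet would be declared lost after roughly 52 ms at the latest.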
Stream Destination Parameters#
The following parameters (GigE Vision only) allow you to configure where the stream grabber sends the grabbed data.
The stream grabber can send the stream data to one specific device or to multiple devices in the network.
Transmission Type#
Use the TransmissionType parameter to define how stream data is transferred within the network. You can set the parameter to the following values:
Unicast (default): The stream data is sent to a single device in the local network, usually the camera's GigE network adapter (see destination address). Other devices can't receive the stream data.
LimitedBroadcast: The stream data is sent to all devices in the local network (255.255.255.255), even if they aren't interested in receiving stream data. In large local networks, this uses a large amount of network bandwidth. To use this transmission type, you must set up the controlling and monitoring applications.
SubnetDirectedBroadcasting: The stream data is sent to all devices in the same subnet as the camera, even if they aren't interested in receiving stream data. If the subnet is small, this may save network bandwidth. Because devices outside the subnet can't receive the stream data, this transmission type can be useful, e.g., for security purposes.
For subnet-directed broadcasting, the stream grabber uses a subnet broadcast address. The subnet broadcast address is obtained by performing a bitwise OR between the camera's IP address and the bit complement of the camera's subnet mask (see destination address). To use this transmission type, you must set up the controlling and monitoring applications.
Multicast: The stream data is sent to selected devices in the local network. This saves network bandwidth because data is only sent to those devices that are interested in receiving the data. Also, you can specify precisely which devices you want to send the data to.
To use multicast, the stream destination address must be set to a multicast group address (224.0.0.0 to 239.255.255.255). Also, you must set up the controlling and monitoring applications. Then, the pylon API automatically takes care of creating and managing a multicast group that other devices can join.
UseCameraConfig: The stream transmission configuration is read from the camera. Use this option only if you want to set up a monitoring application.
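The subnet broadcast address computation described above (bitwise OR of the camera's IP address with the bit complement of its subnet mask) can be sketched in plain C++, independent of the pylon API:

```cpp
#include <cstdint>
#include <string>

// Subnet-directed broadcast address: (IP) OR NOT (subnet mask),
// with both addresses handled as 32-bit values.
std::uint32_t subnetBroadcast(std::uint32_t ip, std::uint32_t mask)
{
    return ip | ~mask;
}

// Renders a 32-bit address in dotted-quad notation (for illustration).
std::string toDottedQuad(std::uint32_t addr)
{
    return std::to_string((addr >> 24) & 0xFF) + "." +
           std::to_string((addr >> 16) & 0xFF) + "." +
           std::to_string((addr >> 8) & 0xFF) + "." +
           std::to_string(addr & 0xFF);
}
```

For a camera at 192.168.1.10 with subnet mask 255.255.255.0, this yields the broadcast address 192.168.1.255.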
Controlling and Monitoring Applications#
When using limited broadcast, subnet-directed broadcast, or multicast, you usually want to send the image data stream from one camera to multiple destinations.
To achieve this, you must set up exactly one controlling application and one or more monitoring applications.
- The controlling application starts and stops image acquisition. It can also change the camera configuration.
- The monitoring applications receive the stream data. Monitoring applications open the camera in read-only mode. This means that they can't start and stop image acquisition or change the camera configuration.
For testing purposes, you can use one instance of the pylon Viewer as the controlling application and another instance of the pylon Viewer as the monitoring application.
To use different instances of the pylon Viewer as controlling and monitoring applications:
- Start the pylon Viewer and open a GigE device.
- Start another instance of the pylon Viewer. This will act as the monitoring application:
- Windows: Start the pylon Viewer. In the Devices pane of the pylon Viewer, right-click the GigE device opened in step 1 and then click Open Device … > Monitor Mode.
- Linux: At the command line, type:
- macOS: At the command line, type:
./Applications/pylon Viewer.app/Contents/MacOS/pylon Viewer -m
For more information about setting up controlling and monitoring applications, see the C++ Programmer's Guide and Reference Documentation delivered with the Basler pylon Camera Software Suite ("Advanced Topics" -> "GigE Multicast/Broadcast").
Destination Address#
The DestinationAddr parameter indicates the IP address to which the stream grabber sends all stream data.
The value and the access mode of the parameter depend on the TransmissionType parameter value:
| TransmissionType Parameter Value | DestinationAddr Parameter Value | DestinationAddr Access Mode |
|---|---|---|
| Unicast | IP address of the camera's GigE network adapter | Read-only |
| LimitedBroadcast | 255.255.255.255 | Read-only |
| SubnetDirectedBroadcasting | (Camera's IP address) OR NOT (camera's subnet mask) | Read-only |
| Multicast | Default: 239.0.0.1; allowed range: 224.0.0.0 to 239.255.255.255 | Read/write |

Some addresses in the multicast range are reserved. If you are unsure, use an address between 239.255.0.0 and 239.255.255.255. This range is assigned by RFC 2365 as a locally administered address space.
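The multicast address ranges mentioned above can be checked programmatically. A plain C++ sketch, independent of the pylon API:

```cpp
#include <cstdint>

// True if the 32-bit IPv4 address lies in the multicast range
// 224.0.0.0 (0xE0000000) to 239.255.255.255 (0xEFFFFFFF).
bool isMulticast(std::uint32_t addr)
{
    return addr >= 0xE0000000u && addr <= 0xEFFFFFFFu;
}

// True if the address lies in the RFC 2365 local scope
// 239.255.0.0 (0xEFFF0000) to 239.255.255.255 (0xEFFFFFFF).
bool isLocalScopeMulticast(std::uint32_t addr)
{
    return addr >= 0xEFFF0000u && addr <= 0xEFFFFFFFu;
}
```

A destination address for multicast streaming should pass the first check; preferring addresses that also pass the second avoids the reserved parts of the range.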
Destination Port#
The DestinationPort parameter indicates the port to which the stream grabber sends all stream data.
If the parameter is set to 0, pylon automatically selects an unused port.
For more information, see the C++ Programmer's Guide and Reference Documentation delivered with the Basler pylon Camera Software Suite ("Advanced Topics" -> "Selecting a Destination Port").
Statistics Parameters#
The pylon API provides statistics parameters that allow you to check whether your camera is set up correctly, your hardware components are appropriate, and your system performs well.
At camera startup, all statistics parameters are set to 0. While continuously grabbing images, the parameters are continuously updated to provide information about, e.g., lost images or buffers that were grabbed incompletely.
Buffer Underrun Count#
The Statistic_Buffer_Underrun_Count parameter counts the number of frames lost because there were no buffers in the queue.
The parameter value increases whenever an image is received, but there are no queued, free buffers in the driver input queue and therefore the frame is lost.
Failed Buffer Count#
The Statistic_Failed_Buffer_Count parameter counts the number of buffers that returned with status "failed", i.e., buffers that were grabbed incompletely.
The error code for incompletely grabbed buffers is 0xE1000014 on GigE cameras and 0xE2000212 on USB 3.0 cameras.
Failed Packet Count#
The Statistic_Failed_Packet_Count parameter counts packets that were successfully received by the stream grabber, but have been reported as "failed" by the camera.
The most common reason for packets being reported as "failed" is that a packet resend request couldn't be satisfied by the camera. This occurs, e.g., if the requested data has already been overwritten by new image data inside the camera's memory.
The Failed Packet Count does not count packets that are considered lost because all resend requests have failed. In this case, the Failed Buffer Count will be increased, but not the Failed Packet Count.
Last Block ID#
The Statistic_Last_Block_Id parameter indicates the last grabbed block ID.
Last Failed Buffer Status#
The Statistic_Last_Failed_Buffer_Status parameter indicates the status code of the last failed buffer.
Last Failed Buffer Status Text#
The Statistic_Last_Failed_Buffer_Status_Text parameter indicates the last error status of a read or write operation as text.
Missed Frame Count#
The Statistic_Missed_Frame_Count parameter counts the number of frames that were acquired but skipped because the camera's internal frame buffer was already full. Basler USB 3.0 cameras are equipped with a frame buffer of 56 MB.
A high Missed Frame Count indicates that the xHCI host controller doesn't support the bandwidth of the camera, i.e., the host controller does not retrieve the acquired images on time. This causes the camera to buffer images in its internal frame buffer. When the internal frame buffer is full, the camera will start skipping newly acquired sensor data. For more information, see the USB 3.0 specification (Bulk Transaction type).
Resend Packet Count#
The Statistic_Resend_Packet_Count parameter counts the number of packets requested by resend requests.
- If you are using the Filter Driver and the driver hasn't received the "leader" of a frame, i.e., the packet indicating the beginning of a frame, it will disregard the complete frame. No resend requests will be sent and no statistics parameters will be increased. This means that if the "leader" packet is lost, the complete frame will be lost without notice. Basler recommends checking the Frame Counter chunk to detect lost frames.
- If you are using the Performance Driver, the driver detects missing "leader" packets, sends resend requests, and adjusts the statistics parameters accordingly.
Resend Request Count#
The Statistic_Resend_Request_Count parameter counts the number of packet resend requests sent.
Depending on the driver type and the stream grabber settings, the stream grabber may send multiple requests for one missing packet, or it may send one request for multiple packets. Therefore, the Resend Request Count and the Resend Packet Count will most likely be different.
Resynchronization Count#
The Statistic_Resynchronization_Count parameter counts the number of stream resynchronizations.
If the host gets out of sync within the streaming process, it initiates a resynchronization, and the camera's internal buffer is flushed.
A host may get out of sync if it requests stream packets with a specific sequence of IDs, but the device delivers packets with a different sequence. This may occur when the connection between the camera and the host is faulty. A host being out of sync results in massive image loss.
A host resynchronization is considered the most serious error case in the USB 3.0 and USB3 Vision specification.
Total Buffer Count#
On GigE cameras, the Statistic_Total_Buffer_Count parameter counts the number of buffers that returned with "success" or "failed" status, i.e., all successfully or incompletely grabbed buffers. On other cameras, e.g., USB cameras, the parameter counts the number of buffers processed.
The error code for incompletely grabbed buffers is 0xE1000014 on GigE cameras and 0xE2000212 on USB 3.0 cameras.
Total Packet Count#
The Statistic_Total_Packet_Count parameter counts all packets received, including packets that have been reported as "failed", i.e., including the Failed Packet Count.
C++:

```cpp
// ** General Parameters **
// Access Mode
AccessModeEnums accessMode = camera.GetStreamGrabberParams().AccessMode.GetValue();
// Auto Packet Size
camera.GetStreamGrabberParams().AutoPacketSize.SetValue(true);
// Maximum Buffer Size
camera.GetStreamGrabberParams().MaxBufferSize.SetValue(131072);
// Maximum Number of Buffers
camera.GetStreamGrabberParams().MaxNumBuffer.SetValue(16);
// Maximum Transfer Size
camera.GetStreamGrabberParams().MaxTransferSize.SetValue(1048568);
// Num Max Queued Urbs
camera.GetStreamGrabberParams().NumMaxQueuedUrbs.SetValue(64);
// Receive Thread Priority Override
camera.GetStreamGrabberParams().ReceiveThreadPriorityOverride.SetValue(true);
// Receive Thread Priority
camera.GetStreamGrabberParams().ReceiveThreadPriority.SetValue(15);
// Socket Buffer Size (socket driver only)
camera.GetStreamGrabberParams().SocketBufferSize.SetValue(2048);
// Status
StatusEnums streamGrabberStatus = camera.GetStreamGrabberParams().Status.GetValue();
// Transfer Loop Thread Priority
camera.GetStreamGrabberParams().TransferLoopThreadPriority.SetValue(15);
// Type of GigE Vision Driver
camera.GetStreamGrabberParams().Type.SetValue(Type_WindowsIntelPerformanceDriver);
// Type: Socket Driver Available
int64_t socketDriverAvailable = camera.GetStreamGrabberParams().TypeIsSocketDriverAvailable.GetValue();
// Type: Windows Filter Driver Available
int64_t filterDriverAvailable = camera.GetStreamGrabberParams().TypeIsWindowsFilterDriverAvailable.GetValue();
// Type: Windows Intel Performance Driver Available
int64_t performanceDriverAvailable = camera.GetStreamGrabberParams().TypeIsWindowsIntelPerformanceDriverAvailable.GetValue();
// ** Packet Resend Mechanism Parameters **
// Enable Resends
camera.GetStreamGrabberParams().EnableResend.SetValue(true);
// Packet Timeout (Filter Driver only)
camera.GetStreamGrabberParams().PacketTimeout.SetValue(40);
// Frame Retention (Filter Driver only)
camera.GetStreamGrabberParams().FrameRetention.SetValue(200);
// Receive Window Size (Performance Driver only)
camera.GetStreamGrabberParams().ReceiveWindowSize.SetValue(16);
// Resend Request Threshold (Performance Driver only)
camera.GetStreamGrabberParams().ResendRequestThreshold.SetValue(5);
// Resend Request Batching (Performance Driver only)
camera.GetStreamGrabberParams().ResendRequestBatching.SetValue(10);
// Maximum Number of Resend Requests (Performance Driver only)
camera.GetStreamGrabberParams().MaximumNumberResendRequests.SetValue(25);
// Resend Timeout (Performance Driver only)
camera.GetStreamGrabberParams().ResendTimeout.SetValue(2);
// Resend Request Response Timeout (Performance Driver only)
camera.GetStreamGrabberParams().ResendRequestResponseTimeout.SetValue(2);
// ** Stream Destination Parameters **
// Transmission Type
camera.GetStreamGrabberParams().TransmissionType.SetValue(TransmissionType_Unicast);
// Destination Address
GenICam::gcstring destinationAddr = camera.GetStreamGrabberParams().DestinationAddr.GetValue();
// Destination Port
camera.GetStreamGrabberParams().DestinationPort.SetValue(0);
// ** Statistics Parameters **
// Buffer Underrun Count
int64_t bufferUnderrunCount = camera.GetStreamGrabberParams().Statistic_Buffer_Underrun_Count.GetValue();
// Failed Buffer Count
int64_t failedBufferCount = camera.GetStreamGrabberParams().Statistic_Failed_Buffer_Count.GetValue();
// Failed Packet Count
int64_t failedPacketCount = camera.GetStreamGrabberParams().Statistic_Failed_Packet_Count.GetValue();
// Last Block ID
int64_t lastBlockId = camera.GetStreamGrabberParams().Statistic_Last_Block_Id.GetValue();
// Last Failed Buffer Status
int64_t lastFailedBufferStatus = camera.GetStreamGrabberParams().Statistic_Last_Failed_Buffer_Status.GetValue();
// Last Failed Buffer Status Text
GenICam::gcstring lastFailedBufferStatusText = camera.GetStreamGrabberParams().Statistic_Last_Failed_Buffer_Status_Text.GetValue();
// Missed Frame Count
int64_t missedFrameCount = camera.GetStreamGrabberParams().Statistic_Missed_Frame_Count.GetValue();
// Resend Request Count
int64_t resendRequestCount = camera.GetStreamGrabberParams().Statistic_Resend_Request_Count.GetValue();
// Resend Packet Count
int64_t resendPacketCount = camera.GetStreamGrabberParams().Statistic_Resend_Packet_Count.GetValue();
// Resynchronization Count
int64_t resynchronizationCount = camera.GetStreamGrabberParams().Statistic_Resynchronization_Count.GetValue();
// Total Buffer Count
int64_t totalBufferCount = camera.GetStreamGrabberParams().Statistic_Total_Buffer_Count.GetValue();
// Total Packet Count
int64_t totalPacketCount = camera.GetStreamGrabberParams().Statistic_Total_Packet_Count.GetValue();
```
C#:

```csharp
// ** General Parameters **
// Access Mode
string accessMode = camera.Parameters[PLStream.AccessMode].GetValue();
// Auto Packet Size
camera.Parameters[PLStream.AutoPacketSize].SetValue(true);
// Maximum Buffer Size
camera.Parameters[PLStream.MaxBufferSize].SetValue(131072);
// Maximum Number of Buffers
camera.Parameters[PLStream.MaxNumBuffer].SetValue(16);
// Maximum Transfer Size
camera.Parameters[PLStream.MaxTransferSize].SetValue(1048568);
// Num Max Queued Urbs
camera.Parameters[PLStream.NumMaxQueuedUrbs].SetValue(64);
// Receive Thread Priority Override
camera.Parameters[PLStream.ReceiveThreadPriorityOverride].SetValue(true);
// Receive Thread Priority
camera.Parameters[PLStream.ReceiveThreadPriority].SetValue(15);
// Socket Buffer Size (socket driver only)
camera.Parameters[PLStream.SocketBufferSize].SetValue(2048);
// Status
string streamGrabberStatus = camera.Parameters[PLStream.Status].GetValue();
// Transfer Loop Thread Priority
camera.Parameters[PLStream.TransferLoopThreadPriority].SetValue(15);
// Type of GigE Vision Driver
camera.Parameters[PLStream.Type].SetValue(PLStream.Type.WindowsIntelPerformanceDriver);
// Type: Windows Intel Performance Driver Available
Int64 performanceDriverAvailable = camera.Parameters[PLStream.TypeIsWindowsIntelPerformanceDriverAvailable].GetValue();
// Type: Windows Filter Driver Available
Int64 filterDriverAvailable = camera.Parameters[PLStream.TypeIsWindowsFilterDriverAvailable].GetValue();
// Type: Socket Driver Available
Int64 socketDriverAvailable = camera.Parameters[PLStream.TypeIsSocketDriverAvailable].GetValue();
// ** Packet Resend Mechanism Parameters **
// Enable Resends
camera.Parameters[PLStream.EnableResend].SetValue(true);
// Packet Timeout (Filter Driver only)
camera.Parameters[PLStream.PacketTimeout].SetValue(40);
// Frame Retention (Filter Driver only)
camera.Parameters[PLStream.FrameRetention].SetValue(200);
// Receive Window Size (Performance Driver only)
camera.Parameters[PLStream.ReceiveWindowSize].SetValue(16);
// Resend Request Threshold (Performance Driver only)
camera.Parameters[PLStream.ResendRequestThreshold].SetValue(5);
// Resend Request Batching (Performance Driver only)
camera.Parameters[PLStream.ResendRequestBatching].SetValue(10);
// Maximum Number of Resend Requests (Performance Driver only)
camera.Parameters[PLStream.MaximumNumberResendRequests].SetValue(25);
// Resend Timeout (Performance Driver only)
camera.Parameters[PLStream.ResendTimeout].SetValue(2);
// Resend Request Response Timeout (Performance Driver only)
camera.Parameters[PLStream.ResendRequestResponseTimeout].SetValue(2);
// ** Stream Destination Parameters **
// Transmission Type
camera.Parameters[PLStream.TransmissionType].SetValue(PLStream.TransmissionType.Unicast);
// Destination Address
string destinationAddr = camera.Parameters[PLStream.DestinationAddr].GetValue();
// Destination Port
camera.Parameters[PLStream.DestinationPort].SetValue(0);
// ** Statistics Parameters **
// Buffer Underrun Count
Int64 bufferUnderrunCount = camera.Parameters[PLStream.Statistic_Buffer_Underrun_Count].GetValue();
// Failed Buffer Count
Int64 failedBufferCount = camera.Parameters[PLStream.Statistic_Failed_Buffer_Count].GetValue();
// Failed Packet Count
Int64 failedPacketCount = camera.Parameters[PLStream.Statistic_Failed_Packet_Count].GetValue();
// Last Block ID
Int64 lastBlockId = camera.Parameters[PLStream.Statistic_Last_Block_Id].GetValue();
// Last Failed Buffer Status
Int64 lastFailedBufferStatus = camera.Parameters[PLStream.Statistic_Last_Failed_Buffer_Status].GetValue();
// Last Failed Buffer Status Text
string lastFailedBufferStatusText = camera.Parameters[PLStream.Statistic_Last_Failed_Buffer_Status_Text].GetValue();
// Missed Frame Count
Int64 missedFrameCount = camera.Parameters[PLStream.Statistic_Missed_Frame_Count].GetValue();
// Resend Packet Count
Int64 resendPacketCount = camera.Parameters[PLStream.Statistic_Resend_Packet_Count].GetValue();
// Resend Request Count
Int64 resendRequestCount = camera.Parameters[PLStream.Statistic_Resend_Request_Count].GetValue();
// Resynchronization Count
Int64 resynchronizationCount = camera.Parameters[PLStream.Statistic_Resynchronization_Count].GetValue();
// Total Buffer Count
Int64 totalBufferCount = camera.Parameters[PLStream.Statistic_Total_Buffer_Count].GetValue();
// Total Packet Count
Int64 totalPacketCount = camera.Parameters[PLStream.Statistic_Total_Packet_Count].GetValue();
```
You can also use the pylon Viewer to easily set the parameters.