How to program our industrial UVC camera using Python

Python is a programming language known for its simplicity, rich libraries, cross-platform compatibility, open-source nature, and versatility. Here are some key features:

1. Simple Syntax: Python’s syntax is straightforward and easy to read, making it accessible even to programming beginners.

2. Rich Libraries: Python boasts a vast array of libraries that enable easy execution of various tasks across different domains.

3. Cross-Platform: Python runs on various platforms, including Windows, Linux, and macOS, ensuring broad compatibility.

4. Open Source: Python is open-source and freely available for anyone to use, contributing to its widespread adoption.

5. Versatile Use: Python is suitable for a wide range of applications, including web development, data analysis, machine learning, artificial intelligence, automation, and game development.

Given these characteristics, Python is widely used by both beginners and professionals for various development purposes.

To utilize our UVC camera and capture video using Python, you can execute the following code:
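A minimal sketch is shown below. It assumes that OpenCV has been installed (for example with pip install opencv-python) and that the camera enumerates as device 0; it previews the live image and records it to an AVI file until the q key is pressed.

import cv2

# Open the UVC camera (device index 0; change the index if several cameras are connected)
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("Camera not found")

# Use the frame size reported by the camera for the video file
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
fourcc = cv2.VideoWriter_fourcc(*"MJPG")
writer = cv2.VideoWriter("capture.avi", fourcc, 30.0, (width, height))

while True:
    ret, frame = cap.read()
    if not ret:
        break
    writer.write(frame)                      # save the frame to the video file
    cv2.imshow("UVC camera", frame)          # live preview
    if cv2.waitKey(1) & 0xFF == ord("q"):    # press the q key to stop
        break

writer.release()
cap.release()
cv2.destroyAllWindows()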

How to capture long-duration videos using a high-definition camera

When it comes to recording long videos lasting several hours with a high-definition camera, there are two main approaches: recording with a computer (PC) and recording without one.

Using a computer allows direct recording of videos onto the PC’s storage, facilitating extended recording sessions. For details on how to utilize a computer for this purpose, please refer to the following page.

 

If you’re not using a PC, another option is to use our HDMI recorder with an external HDD connected to it.

 

You can achieve long-duration recording by connecting a high-capacity external HDD (up to 1TB) instead of a USB flash drive.

Please note that externally powered HDDs are required as bus-powered external HDDs will not function properly.

 

Moreover, ensure that the HDD is formatted as NTFS for long recordings, because FAT32 limits each file to 4GB. The exFAT format is not supported for this purpose.

If you want to use an industrial camera on Linux

■ The Genesis of Linux
Linux, unlike Windows, comes in various distributions.

Linux distributions can be developed and distributed freely by anyone, allowing individuals, small groups, and even corporations to provide distributions tailored for different hardware and purposes.

Popular distributions for personal computers and servers include “Debian GNU/Linux,” “Ubuntu,” “CentOS,” “Raspbian,” and “Fedora.”

However, not all applications are compatible with every distribution.

Below, we outline the process of using our cameras with Linux.

■ Using Industrial UVC Cameras
UVC cameras are compatible with many distributions.

Many Linux distributions already have UVC camera drivers built-in.

Additionally, the OpenCV image processing library supports UVC cameras, making it much easier to program compared to industrial USB3 Vision / GigE Vision cameras.

Traditionally, UVC cameras have been associated with built-in lenses, mass production, and low-cost sales akin to webcams. However, our company offers cameras with external terminals (trigger terminals) and interchangeable C-mount lenses.

How to choose a UVC camera

At our company, we handle two main types of UVC cameras.

The first type is the DN series.

DN series cameras are considered higher-end compared to typical UVC cameras. There are two main points that distinguish them:

The first point is the presence of trigger terminals. While UVC cameras with trigger terminals are rare, they are common in industrial USB cameras. This feature was added to meet the demand for using UVC cameras in industrial settings.

The second point is the inclusion of a tool program that remembers camera settings such as white balance. UVC cameras typically have automatic brightness and color adjustment settings, which cannot be manually adjusted. However, with the tool program included with DN series cameras, adjustments can be made and saved.

While iControl operates correctly, the camera cannot be found from the application software.

If experiencing this issue on a 64-bit version of Windows, try the following steps with an account that has administrative privileges:

1) Download the file from the following link: VC_redist.x64.exe.

2) Run the downloaded VC_redist.x64.exe file. Check "I agree" on the initial screen and click "Install." There is no need to check "Send feedback." Finally, click "Finish" to exit.

3) Right-click the "Windows" icon at the bottom left of the screen and click "Command Prompt (Administrator)." On Windows 7, go to "All Programs" > "Accessories," right-click "Command Prompt," and select "Run as administrator."

4) Click "Yes" if prompted by User Account Control.

5) In the black window that appears, type the following line and press Enter:

regsvr32 netccam64.ax

6) After the confirmation message "DllRegisterServer in netccam64.ax succeeded" appears, click "OK," and then close the black window.

7) Restart your computer.

In an account without administrative privileges, the camera may not be detected by software using DirectShow.

If the driver setup was performed without administrative privileges, the DirectShow source filter may not be enabled.

To enable the DirectShow source filter after setting up the driver without administrative privileges, follow these steps:

1) Sign in again with an account that has administrative privileges.

2) Connect the camera and wait for the driver to be installed. It’s crucial to connect the camera at least once with an account that has administrative privileges after setting up the driver.

3) Restart Windows.

4) After restarting, sign in again with an account that has administrative privileges. At this point, the DirectShow source filter should be enabled.

After following these steps, the DirectShow source filter should be available even with an account that doesn’t have administrative privileges.

HIGH SPEED CAMERA LOW PRICE (Color / Monochrome) CHU30-C-RS / CHU30-B-RS

Observe what you want to see instantly, in slow motion.
Use the trigger to capture exactly the moment you desire.

 

 

●Improve the usability of high-speed cameras! Two software packages included as standard!

●High-speed camera at an extremely low price

●800 fps at 640 x 480 (VGA) effective pixels;
at 640 x 360 resolution, the maximum speed is 1,000 fps.

If color is essential to you, go with the color camera option.
If color is not critical but more light sensitivity is required, the monochrome camera should be used.

 

The color observed through the microscope does not match the color of the camera image.

When the colors observed through the microscope do not match the colors in the camera image, white balance correction is necessary. However, sometimes the correction using the One Push button may not produce satisfactory results, especially when the light source of the microscope has a tint of orange or yellow.

This discrepancy occurs because the One Push button attempts to correct the color temperature of the light source to white, sometimes excessively. In such cases, manual correction is required to adjust the colors accurately.

 

Please follow the steps below to adjust the colors:

 

1) Reset all settings of the camera to default.

2) Set up the microscope with only the light source or a prepared slide without any specimen. Adjust the brightness on the microscope to ensure it’s not overexposed.

3) Click the One Push button in the camera properties. This will make the camera image appear white.

4) Manually adjust the Red/Blue/Green values in the camera properties to match the colors observed through the microscope as closely as possible.

I would like to connect and use four USB 2.0 5-megapixel cameras simultaneously.

It is possible to connect four USB 2.0 5-megapixel cameras simultaneously, such as the DN2R-500. However, there may be limitations on frame rates and other factors depending on the application.

Primarily, bandwidth considerations should be taken into account, followed by an evaluation of the power and compatibility between the PC’s CPU, memory, USB host controller performance, and the load of all four cameras.

For instance, one 5-megapixel camera operating at 5fps requires roughly 200Mbps, which is approaching half of the USB 2.0 host controller's theoretical maximum of 480Mbps. In practice, the effective bandwidth is around 360Mbps, so a single camera already uses more than half of the available bandwidth.
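As a rough back-of-the-envelope sketch (the figures are illustrative and ignore protocol overhead), the calculation looks like this:

# Rough bandwidth estimate for one 5-megapixel, 8-bit camera at 5 fps
pixels = 5_000_000
bits_per_pixel = 8
fps = 5

required_mbps = pixels * bits_per_pixel * fps / 1_000_000
print("Required bandwidth: %.0f Mbps" % required_mbps)        # about 200 Mbps

usb2_effective_mbps = 360                                     # practical USB 2.0 throughput
print("Share of USB 2.0 bandwidth: %.0f%%" % (100 * required_mbps / usb2_effective_mbps))  # about 56%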

Operating at 1fps or having four USB host controllers might make it feasible, but the actual performance depends on the PC. Therefore, it’s essential to evaluate the setup practically.

For simultaneous operation of four cameras, using USB 3.0 would be preferable. With USB 3.0, a single host controller provides 5Gbps (5,000Mbps), allowing sufficient bandwidth for simultaneous connections, even with a hub.

Though consumer-grade hubs are prevalent, they may introduce uncertainties. For industrial applications, PCIe boards with multiple USB 3.0 ports are commonly used. These boards provide a stable and reliable solution, albeit at a higher cost than consumer-grade alternatives.

For instance, using a PCIe board with four USB 3.0 ports is a common approach. While consumer-grade options are available, industrial-grade boards from companies like AVAL DATA offer long-term supply and stability but come at a higher price.

Utilizing motherboards with built-in USB 3.0 ports is another option. However, performance-wise, a four-channel PCIe board is the most stable solution.

Connecting and processing images from four cameras can significantly strain the PC’s resources, so the intended usage scenario should also be considered. For instance, USB 3 Vision models impose the lowest CPU load, followed by USB 3.0 and then USB 2.0 models, respectively.

Is it feasible to capture images at a rate of 6 frames per second with an external trigger (a 6Hz pulse signal) on a USB 3.0 1.3-megapixel monochrome camera?

Utilizing an external trigger to capture 6 frames per second with the DN3R-500BU, a USB 3.0 monochrome camera with a resolution of 5 megapixels, poses no issues whatsoever. Our USB 2.0 and USB 3.0 cameras share common trigger/strobe specifications. Please refer to the documentation and the FAQ below for assistance when using an external trigger.

 

The voltage specification for the external trigger is 3.3V to 24V, with many customers typically using 5V. Please ensure that the input pulse width is set to 5.1μs or greater.

 

The delay for the external trigger, referring to the time between the camera receiving the trigger and the start of exposure, is typically in the range of a few microseconds (μs). If intentional delay is desired, it can be achieved by setting the Trigger Delay using the SDK.

 

When viewing the camera from the rear, Pin1 is located on the left side and Pin8 on the right side. Please refer to the attached manual for details. The wires of the short cable included with the camera correspond to Pin1 through Pin8 in the following order: white, black, red, orange, yellow, black, white, black. For an external trigger, use Pin1 and Pin2.

The dynamic range supported by a USB 3.0 monochrome camera with a resolution of 2 million pixels varies depending on the model.

The term “dynamic range” refers to the value representing the range of white to black that a single pixel can express. In the case of 8 bits, there are 256 levels, allowing a single pixel to express 256 shades of gray. With 12 bits, a single pixel can represent 4096 levels, enabling finer gray-scale representation, albeit with increased data volume. The dynamic range depends on the specifications of the image sensor being used, rather than the software or PC.

Now, the USB 3.0 2-megapixel monochrome camera (DN3RG-200BU) sends 8-bit data by default. The 47fps listed on our website therefore refers to the frame rate at 8 bits. For the DN3RG-200BU, both 8-bit (default) and 10-bit (transferred as 16-bit) modes achieve 47fps.

Dynamic range varies depending on the model. The 10-bit data is transferred as 16-bit because images must be transferred in either 8-bit or 16-bit units; of those 16 bits, only 10 carry image data and the remaining 6 are padding, so the practical dynamic range is 10 bits.

 

The cameras operating in either 8-bit or 10-bit mode include:

– DN3G-30BU (USB3.0, 300,000-pixel monochrome)
– DN3RG-130BU (USB3.0, 1.3-megapixel monochrome)
– DN3RG-200BU (USB3.0, 2-megapixel monochrome)

The cameras operating in either 8-bit or 12-bit mode include:

– DN3R-500BU (USB3.0, 5-megapixel monochrome)
– DN3R-1000BU (USB3.0, 10-megapixel monochrome)

All USB2.0 cameras operate solely in 8-bit mode.

What is the method for distinguishing whether a frame is a GoodFrame or a BadFrame in a callback?

The determination is made based on the lBufferSize parameter passed to the callback function.

 

long CALLBACK CallbackFunc(BYTE *pBuffer, long lBufferSize, PVOID pContext)

 

When lBufferSize == 0, it indicates a BadFrame.

When lBufferSize != 0, it indicates a GoodFrame.

 

Judgment should not rely on pBuffer.

 

pBuffer == NULL signifies that memory for the frame is not allocated. While it will not be NULL during a GoodFrame, its value during a BadFrame is indeterminate.

 

To ensure that the callback is invoked even during BadFrame occurrences, the following code must be executed at the appropriate initialization stage:

 

ICubeSDK_SetCamParameter(n, REG_CALLBACK_BR_FRAMES, ON);  // n is the camera code

 

The ON parameter in the last argument enables invoking the callback during BadFrame occurrences. When set to OFF, the callback is only triggered during GoodFrame occurrences.

How to use a DN camera with LabVIEW

When using our DN series cameras with LabVIEW, it is necessary to use NI-IMAQ. If you do not have NI-IMAQ, please obtain it.

Furthermore, the camera must be switched to USB3 Vision mode. Please note that, as shipped by us, the camera is set to SDK mode.

Please refer to the following steps for switching modes.

 

 

What is ISO sensitivity?

ISO sensitivity is a term from the era of film cameras, when sensitivity was adjusted by changing the film. For example, if one wanted to use a faster shutter speed but the film was not sensitive enough, one would switch to a film with a higher ISO rating.

 


 

Even though film cameras are no longer prevalent, some digital cameras, such as compact digital cameras and digital single-lens reflex cameras, continue to offer ISO sensitivity settings. This is due to the background belief that for users accustomed to ISO sensitivity, expressing sensitivity settings in ISO values would be more intuitive and understandable.

 


 

However, in today’s industrial cameras, there are few that utilize ISO notation for sensitivity settings. Our company’s cameras also do not allow sensitivity settings in ISO notation.

While ISO sensitivity is not available, there are several methods to adjust brightness.

 


Brightness in industrial cameras is adjusted using the following methods, although not through ISO sensitivity:

– Adjusting the aperture of the lens (generally, a larger aperture means a brighter image)
– Adjusting with lighting (using lighting fixtures to adjust brightness, applying strong strobe light to increase brightness, etc.)
– Adjusting exposure time of the camera (increasing it makes the image brighter)
– Adjusting gain (increasing it makes the image brighter)

Note: Increasing gain too much deteriorates the signal-to-noise ratio, making grainy noise more prominent.
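For reference, when a UVC camera is opened through OpenCV, exposure time and gain can often be adjusted in the same way. The following is only a sketch: whether these properties work, and what the values mean, depends on the camera, the driver, and the OS backend.

import cv2

cap = cv2.VideoCapture(0)

# Try to switch off automatic exposure where the driver allows it.
# The value that selects manual mode differs by backend (commonly 0.25 or 1); check your environment.
cap.set(cv2.CAP_PROP_AUTO_EXPOSURE, 0.25)

cap.set(cv2.CAP_PROP_EXPOSURE, -5)   # longer exposure makes the image brighter (units are backend-dependent)
cap.set(cv2.CAP_PROP_GAIN, 10)       # higher gain makes the image brighter, but increases noise

ret, frame = cap.read()
cap.release()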

For April 2024 HOLIDAY NOTICE

Dear Valued Customers, Business Associates, and Partners,

Shodensha Vietnam is pleased to announce our April 2024 holiday schedule as follows:

– Hung King Commemoration Day on April 17, 2024 (1 day)

– Liberation Day & May Day from Monday, 29th April 2024 to Wednesday, 1 May 2024

– Business operation will resume as normal on 2th May 2024(Thursday).

 Wishing you and your family a happy holiday.

MvCameraControl.dll is not found in the Python sample program for CS/EG series cameras

When executing the Python sample program for CS/EG series cameras, the following error message may be displayed and the program cannot run.

 

FileNotFoundError: Could not find module 'MvCameraControl.dll'

 

This is because the folder containing MvCameraControl.dll is not on the Windows search path.

 

MvCamCtrldll = WinDLL("MvCameraControl.dll")

 

On Windows, adding the folder that contains MvCameraControl.dll to the PATH environment variable solves the problem, but it is easier to change the line where the error occurs as follows.

 

・Changes for 32-bit environments

MvCamCtrldll = WinDLL(r"C:\Program Files (x86)\Common Files\MVS\Runtime\Win32_i86\MvCameraControl.dll")

 

 

・Changes for 64-bit environments

MvCamCtrldll = WinDLL(r"C:\Program Files (x86)\Common Files\MVS\Runtime\Win64_x64\MvCameraControl.dll")
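Alternatively, on Python 3.8 or later the folder containing MvCameraControl.dll can be added to the DLL search path at run time. The following is a sketch that assumes the default 64-bit MVS installation path; adjust the folder to match your environment.

import os
from ctypes import WinDLL

dll_dir = r"C:\Program Files (x86)\Common Files\MVS\Runtime\Win64_x64"   # default 64-bit MVS runtime folder
os.add_dll_directory(dll_dir)                                            # also lets dependent DLLs be found
MvCamCtrldll = WinDLL(os.path.join(dll_dir, "MvCameraControl.dll"))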

 

I WANT TO USE OPENCV WITH THE PYTHON SAMPLE PROGRAM FOR CS/EG SERIES CAMERAS (MONOCHROME)

The Python sample program for CS/EG series cameras is not written for OpenCV; it is generic code that does not depend on any specific library.

The following is an example of “GrabImage.py”.

Python-OpenCV uses numpy, and the image data is treated as a numpy array. In GrabImage.py, we use MV_CC_GetOneFrameTimeout() within work_thread() to get the image, but the image data is passed in pData.

However, this is a pointer in C and cannot be used in OpenCV as is. That’s why we need to convert this pData to a numpy array.

1) Add the following two lines of import statements:

import numpy as np
import cv2

 

2) In __main__, before the ret = cam.MV_CC_StartGrabbing() line, add the following line to set the image format output by the camera to monochrome.

ret = cam.MV_CC_SetEnumValueByString("PixelFormat", "Mono8")

 

3) Inside work_thread(), add code to convert pData to a numpy array. (Two lines, cv2.imshow() and cv2.waitKey(), are also added so that you can see the image.)

 

def work_thread(cam=0, pData=0, nDataSize=0):
    stFrameInfo = MV_FRAME_OUT_INFO_EX()
    memset(byref(stFrameInfo), 0, sizeof(stFrameInfo))
    while True:
        ret = cam.MV_CC_GetOneFrameTimeout(pData, nDataSize, stFrameInfo, 1000)
        if ret == 0:
            print("get one frame: Width[%d], Height[%d], nFrameNum[%d]" % (stFrameInfo.nWidth, stFrameInfo.nHeight, stFrameInfo.nFrameNum))
            # Convert the C buffer (pData) to a numpy array and reshape it to height x width
            image = np.asarray(pData._obj)
            image = image.reshape((stFrameInfo.nHeight, stFrameInfo.nWidth))
            # Display the monochrome image with OpenCV
            cv2.imshow("show", image)
            cv2.waitKey(1)
        else:
            print("no data[0x%x]" % ret)
        if g_bExit == True:
            break

 

 

I WANT TO USE OPENCV WITH THE PYTHON SAMPLE PROGRAM FOR CS/EG SERIES CAMERAS (RGB)

The Python sample program for CS/EG series cameras is not written for OpenCV; it is generic code that does not depend on any specific library.

 

In the following, we will use “GrabImage.py” as an example.

Python-OpenCV uses numpy, and image data is treated as a numpy array. In GrabImage.py, the image is acquired by MV_CC_GetOneFrameTimeout() inside work_thread(), and the image data is passed in pData. This is a C pointer and cannot be handled by OpenCV as is, so we need to convert pData to a numpy array.

 

1) Add the following two lines of import statements:

  • import numpy as np
  • import cv2

2) In __main__, before the ret = cam.MV_CC_GetIntValue("PayloadSize", stParam) line, add the following line to set the image format output by the camera to RGB.

 

  • ret = cam.MV_CC_SetEnumValueByString("PixelFormat", "RGB8Packed")

The RGB8Packed part varies depending on the camera, so open MVS, connect the camera, and select the appropriate string from Feature Tree → Image Format Control → Pixel Format.

 

Display in MVS | Corresponding string
RGB 8 | RGB8Packed
BGR 8 | BGR8Packed

 

 

3) Inside work_thread(), insert code to convert pData to a numpy array. (Two lines, cv2.imshow() and cv2.waitKey(), are also added so that you can see the image.)

 

def work_thread(cam=0, pData=0, nDataSize=0):
    stFrameInfo = MV_FRAME_OUT_INFO_EX()
    memset(byref(stFrameInfo), 0, sizeof(stFrameInfo))
    while True:
        ret = cam.MV_CC_GetOneFrameTimeout(pData, nDataSize, stFrameInfo, 1000)
        if ret == 0:
            print("get one frame: Width[%d], Height[%d], nFrameNum[%d]" % (stFrameInfo.nWidth, stFrameInfo.nHeight, stFrameInfo.nFrameNum))
            # Convert the C buffer (pData) to a numpy array and reshape it to height x width x 3 channels
            image = np.asarray(pData._obj)
            image = image.reshape((stFrameInfo.nHeight, stFrameInfo.nWidth, 3))
            # Note: cv2.imshow() interprets the array as BGR; with RGB8Packed the red and blue
            # channels appear swapped unless converted with cv2.cvtColor(image, cv2.COLOR_RGB2BGR)
            cv2.imshow("show", image)
            cv2.waitKey(1)
        else:
            print("no data[0x%x]" % ret)
        if g_bExit == True:
            break

 

 

I WANT TO USE OPENCV WITH THE PYTHON SAMPLE PROGRAM FOR CS/EG SERIES CAMERAS (BAYER)

The Python sample program for CS/EG series cameras is not written for OpenCV; it is generic code that does not depend on any specific library.

 

In the following, we will use “GrabImage.py” as an example.

 

Python-OpenCV treats image data as numpy arrays, so you need to adapt to its data format. In GrabImage.py, we use MV_CC_GetOneFrameTimeout() in work_thread() to get the image, and the image data is passed in pData, which is equivalent to a pointer in C.

 

As it is, it cannot be handled by OpenCV, so we need to convert this pData to a numpy array. Follow the steps below to do the conversion.

 

1) Add the following two lines of import statements:

  • import numpy as np
  • import cv2

2) In __main__, before the ret = cam.MV_CC_StartGrabbing() line, add the following line to set the image format output by the camera.

  • ret = cam.MV_CC_SetEnumValueByString("PixelFormat", "BayerRG8")

The BayerRG8 part varies depending on the camera, so open MVS, connect the camera, and select the appropriate string from Feature Tree → Image Format Control → Pixel Format.

 

Display in MVS | Corresponding string
Bayer RG 8 | BayerRG8
Bayer GR 8 | BayerGR8
Bayer BG 8 | BayerBG8
Bayer GB 8 | BayerGB8

 


 

 

3) Inside work_thread(), add code to convert pData to a numpy array. (Two lines, cv2.imshow() and cv2.waitKey(), are also added so that you can check the image.)

def work_thread(cam=0, pData=0, nDataSize=0):
    stFrameInfo = MV_FRAME_OUT_INFO_EX()
    memset(byref(stFrameInfo), 0, sizeof(stFrameInfo))
    while True:
        ret = cam.MV_CC_GetOneFrameTimeout(pData, nDataSize, stFrameInfo, 1000)
        if ret == 0:
            print("get one frame: Width[%d], Height[%d], nFrameNum[%d]" % (stFrameInfo.nWidth, stFrameInfo.nHeight, stFrameInfo.nFrameNum))
            # Convert the C buffer (pData) to a numpy array and reshape it to height x width
            image = np.asarray(pData._obj)
            image = image.reshape((stFrameInfo.nHeight, stFrameInfo.nWidth))
            # Demosaic the Bayer RAW data into a BGR color image
            image = cv2.cvtColor(image, cv2.COLOR_BayerBG2BGR)
            cv2.imshow("show", image)
            cv2.waitKey(1)
        else:
            print("no data[0x%x]" % ret)
        if g_bExit == True:
            break

 

Replace the cv2.COLOR_BayerBG2BGR part with the conversion code that corresponds to the PixelFormat string you entered earlier.

 

PixelFormat string | Corresponding cv2.cvtColor code
BayerRG8 | cv2.COLOR_BayerBG2BGR
BayerGR8 | cv2.COLOR_BayerGB2BGR
BayerBG8 | cv2.COLOR_BayerRG2BGR
BayerGB8 | cv2.COLOR_BayerGR2BGR

 

The notation of the second argument of cv2.cvtColor() does not match the intuitive Bayer sequence.

This is an OpenCV specification.

 

Specifications can be found below.

 

(Screenshot: OpenCV documentation of the Bayer conversion codes)

 

 

If the color of the video is incorrect, you may have made a mistake in the second argument of cv2.cvtColor().

 

THE CAMERA CANNOT BE FOUND USING SOFTWARE THAT USES DIRECTSHOW WITH AN ACCOUNT THAT DOES NOT HAVE ADMINISTRATOR PRIVILEGES.

DirectShow source filters will not be enabled if you set up the driver with an account that does not have administrator privileges.

 

If you have set up the driver with an account that does not have administrator privileges, you can enable the DirectShow source filter by following the steps below.

 

1) Sign in again with an account that has administrator privileges.

 

2) Connect the camera and wait until the driver is installed.
The important thing is to connect the camera using an account with administrator privileges at least once after setting up the driver.

 

3) Restart Windows.

 

4) After restarting, sign in again using an account with administrator privileges.
At this time, the DirectShow source filter is enabled.

 

After performing these steps, the DirectShow source filter can be used even with an account that does not have administrator privileges.

High-performance image processing/image analysis software with Standard Material option WinROOF 2023

Easily digitize the evaluation of metal materials according to various industrial standards with one software!

In addition to various measurement functions, four measurements for metal material evaluation are possible!

<Measurement items>
・Graphite nodularity measurement
・DAS measurement
・Nonmetallic inclusion measurement
・Crystal grain size measurement

 

・Eliminates variation in measurement results!

・Compatible with new standards!

BRINELL HARDNESS TEST SOFTWARE (DENT DIAMETER READING SOFTWARE) BHN MESURE (MANUFACTURED BY NIPPON STEEL TECHNOLOGY CO., LTD.)

Brinell hardness measurement is possible with high precision and little individual variation

●Image processing technology enables highly accurate Brinell hardness measurement with little individual variation.

●Conforms to JIS Z 2243 and ASTM E10-08 Standard Test Method for Brinell Hardness of Metallic Materials, hardness value calculation and display conforms to JIS standards

●Easy operation with optional hand switch

I WANT TO PROGRAM A CS/EG SERIES CAMERA IN PYTHON.

Python is a programming language with the following characteristics.

1. Simple syntax: Python’s syntax is very simple and easy to read, making it easy to learn even for beginners in programming.

2. Extensive libraries: Python has a large number of libraries that make it easy to perform a variety of tasks.

3. Multiplatform: Python runs on a variety of platforms, including Windows and Linux.

4. Open source: Python is open source and free to use, so anyone can use it.

5. Versatile: Python is suitable for a variety of applications, including web development, data analytics, machine learning, artificial intelligence, automation, and game development.

 

For these reasons, Python is now widely used by beginners and professional developers alike.

We also lend out cameras, so download the driver and check how it works (SDK is also included).

Even for distributions we have verified, please confirm operation in your own environment.

■ For Windows

Follow the setup instructions in the downloaded folder to complete the setup.

The Python sample code can be found in the following location, so please refer to it when creating your program.

 

(Screenshot: location of the Python sample code on Windows)

 

 

 

■ For Linux

Since there are many variants of Linux, it is not possible to provide drivers for all distributions.

Even our verified distributions do not guarantee 100% operation.

Follow the setup instructions in the downloaded folder. The Python sample code can be found here:

Choose 32-bit or 64-bit according to your CPU, and refer to the sample code when creating your program.

 

(Screenshot: location of the Python sample code on Linux)

 

 

 

THINGS TO CHECK WHEN IT SUDDENLY STOPS WORKING

If your camera suddenly stops working, first check the following.
a) Try disconnecting and reconnecting the cable.

 

b) Is the camera correctly recognized as an imaging device in Device Manager?

Imaging devices have the following names:
For a USB 2.0 camera: "NET ICube_CVam Device USB20"
For a USB 3.0 camera: "NET ICube_CVam Device USB30"
If no imaging device appears in Device Manager, or if the device above is marked with a "!" or "x" icon, the camera will not work.
In this case, try setting up the driver again using an account with administrator privileges.

 

c) If it is recognized in b), does the name of the connected camera appear in the camera model list at the top left of iControl?
(e.g., "NET3 4133 CU", "NET 1044 CU")

 

d) If the name of the camera is in iControl in c), is it possible to obtain an image by pressing the Start button?
(Connect the camera, start it, and see if an image appears)

 

e) If no image is output in step d), the image is pitch black, and the frame counter at the bottom does not increase, shut down the PC, restart it, and perform step d) again.
If the frame counter is increasing, the lens aperture may not be open wide enough or the amount of light may be insufficient.

 

f) Does lowering the clock value improve the situation?

CAMERA CURRENT CONSUMPTION

<Target model>
DN2G-30 / DN2G-30K / DN2G-30BU / DN2G-30BUK
DN2RG-130 / DN2RG-130K / DN2RG-130BU / DN2RG-130BUK
DN2RG-200 / DN2RG-200K / DN2RG-200BU / DN2RG-200BUK

DN3G-30 / DN3G-30K / DN3G-30BU / DN3G-30BUK
DN3RG-130 / DN3RG-130K / DN3RG-130BU / DN3RG-130BUK
DN3RG-200 / DN3RG-200K / DN3RG-200BU / DN3RG-200BUK

200mA or less

 

 

<Target model>
DN2R-300 / DN2R-300K
DN2R-500 / DN2R-500K / DN2R-500BU / DN2R-500BUK
DN2R-1000 / DN2R-1000K / DN2R-1000BU / DN2G-1000BUK

DN3R-300 / DN3R-300K
DN3R-500 / DN3R-500K / DN3R-500BU / DN3G-500BUK
DN3R-1000 / DN3R-1000K / DN3R-1000BU / DN3G-1000BUK

DN3V-300 / DN3V-300K
DN3V-500 / DN3V-500K / DN3V-500BU / DN3V-500BUK
DN3V-1000 / DN3V-1000K / DN3V-1000BU / DN3V-1000BUK

300mA or less

I WANT TO PROGRAM A CS/EG SERIES CAMERA ON LINUX.

We recommend using a UVC camera when programming in a Linux environment.

In general, UVC cameras are recognized and easy to use on Linux without installing any drivers.

In addition, in combination with OpenCV, it is relatively easy to program.
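As a minimal illustration (a sketch that assumes the camera is enumerated as /dev/video0 and that OpenCV is installed), a UVC camera can be opened through the V4L2 backend like this:

import cv2

cap = cv2.VideoCapture(0, cv2.CAP_V4L2)        # /dev/video0 corresponds to index 0
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)        # requested size; the driver may adjust it
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)

ret, frame = cap.read()
if ret:
    print("Captured frame:", frame.shape)      # e.g. (720, 1280, 3)
else:
    print("Failed to read a frame")
cap.release()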

 

However, a UVC camera may not be suitable in the following cases:

・When long-distance wiring is required.

・When you want to use a resolution that is not included in our industrial UVC camera lineup.

・When dropped frames need to be detected reliably.

・When white balance must be adjusted (OpenCV does not expose a white balance control, although there is a workaround).

・When you want to make full use of camera functions that cannot be set through OpenCV (e.g., trigger delay).

 

In these cases, you need to set up the Linux driver for your industrial camera so that it can be used on Linux.

There are various derivatives of Linux. As a result, there are very few applications that work on all Linux distributions.

Our industrial cameras (CS/EG series) also do not provide drivers for all distributions.

You can check the Linux distributions that have been confirmed to work from the following.

You can also borrow a camera, so download the driver and check the operation (SDK is also included).

In addition, even if it is a distribution that has been confirmed by us, please check the operation.

 

Shodensha’s Industrial USB Camera GigE Camera Lineup

 

 

USB Camera

Lineup list

 

400,000 pixels ~ 20,000,000 pixels
Color/Monochrome

We have a good lineup.

 

 

 

 

GigE Camera

Lineup list

 

400,000 pixels ~ 20,000,000 pixels
Color/Monochrome

We have a good lineup.

 

WHO IS THE MANUFACTURER OF THE IMAGE SENSOR?

●DN3G-30 / DN3G-30K / DN3G-30BU / DN3G-30BUK:
MT9V032 ON Semiconductor (formerly Aptina)

 

●DN3RG-130 / DN3RG-130K / DN3RG-130BU / DN3RG-130BUK:
EV76C560 e2v technologies

 

●DN3RG-200 / DN3RG-200K / DN3RG-200BU / DN3RG-200BUK:
EV76C570 e2v technologies

 

●DN3R-300/DN3R-300K:
MT9T001 ON Semiconductor (formerly Aptina)

 

●DN3R-500/DN3R-500K:
MT9P001 ON Semiconductor (formerly Aptina)

 

●DN3R-500BU/DN3R-500BUK:
MT9P031 ON Semiconductor (formerly Aptina)

 

●DN3R-1000 / DN3R-1000K / DN3R-1000BU / DN3R-1000BUK:
MT9J003 ON Semiconductor (formerly Aptina)

WHERE CAN I FIND THE MANUAL FOR MVS?

When you install MVS, the instructions are also installed.

 

This article describes where to find the MVS documentation.

■ For Windows

 

If you installed MVS on Windows, you will be provided with two instructions.

① "pdf.pdf"

One is in PDF format, where you can find detailed information and instructions.

 

② "index.html"

The other one is in HTML format, which allows you to check detailed information and instructions through your browser.

 

■ For Linux


 

If you installed MVS in a Linux environment, one instruction manual is provided.

 

"pdf.pdf"

The manual is in PDF format, where you can find detailed information and instructions.

 

HOW TO RECORD VIDEO WITH AN EXTERNAL SIGNAL (TRIGGER SIGNAL)

Our CS series of USB cameras and EG series of GigE cameras are equipped with an external signal input terminal on the camera to support image storage from external signals.

(An optional trigger cable is required to input the signal.) 

 

However, the standard software does not have a built-in function to save from external signals, so you need to use the optional software or create a program using the included SDK.

 

There are two main types of optional software that save still images in response to an external signal.

 

The main difference is between the type that keeps showing a live view while waiting for the external signal, and the type that, instead of a live view, displays the still image saved by the most recent signal while waiting.

The advantage of the always-on live display type is that you can always check the video. Conversely, you can’t check the saved images without accessing the save folder.

Always-Live Display Type HiTriggerQ

The advantage of the still image display type saved by the last signal is that you can always check whether the image is saved normally.

On the other hand, if you want to display live video, you need to cancel the external signal standby state.

Still image storage type saved by the last signal HiTriggerF Pro

If you want to create a program, please use the SDK in the camera driver installation folder.

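If you program it yourself, the camera can be put into external-trigger mode from the SDK's Python sample (GrabImage.py) using the same MV_CC_SetEnumValueByString() call shown in the OpenCV articles above. The lines below are only a sketch: the node names and values ("TriggerMode", "TriggerSource", "Line0") follow the GenICam convention and should be confirmed for your model in the MVS Feature Tree.

# Sketch: add before MV_CC_StartGrabbing() in GrabImage.py (verify node names in MVS)
ret = cam.MV_CC_SetEnumValueByString("TriggerMode", "On")        # wait for an external signal
ret = cam.MV_CC_SetEnumValueByString("TriggerSource", "Line0")   # use the trigger input terminal
# After StartGrabbing(), each trigger pulse delivers one frame, which can then be
# saved inside the grab loop, for example with cv2.imwrite().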

 

If you want to save an image with an external trigger on a camera that does not have an external signal input terminal, you can also use a ready-made input device or similar to send the software's shortcut key from outside and save the image that way.

 

 

 

ABOUT THE PORT OF THE COMPUTER TO WHICH THE USB 3.0 CAMERA IS CONNECTED

USB 3.0 cameras that support the USB 3.0 standard are characterized by high power supply capability and communication speed. However, when you connect it to your computer, it may not work properly unless you connect it to the appropriate port.

In this article, we will introduce the appropriate port for connecting a USB 3.0 camera to a computer.

USB port when connecting a USB 3.0 camera

 

To use a USB 3.0 camera, you need to connect it to a USB 3.0 (or USB 3.1, USB 3.2) port on your computer.

When a USB 3.0 camera is connected to a USB 2.0 port, it may not work properly due to power supply capacity or communication speed.

 

 

■ Difference between USB 1.1, 2.0 and 3.0

 

Name | Maximum data transfer rate | Power delivery capacity
USB 1.1 | 12 Mbps | 5 V, 100 mA
USB 2.0 | 480 Mbps | 5 V, 500 mA
USB 3.2 Gen1 (USB 3.0) | 5 Gbps | 5 V, 900 mA
USB 3.2 Gen2 | 10 Gbps | 5 V, 900 mA
USB 3.2 Gen2x2 | 20 Gbps | 5 V, 900 mA

※USB 3.0 was renamed to USB 3.1 Gen 1 in 2013 and USB 3.2 Gen 1 in 2019.

 

As mentioned above, the maximum data transfer speed (communication speed) of USB 2.0 is 40 times that of USB 1.1, and USB 3.0 is about 400 times that of USB 1.1.

 

How to check if the USB port is 3.0

Here are three ways to check if your computer’s port is USB 3.0.

[Method 1] Check the color of the port.

The standard color of USB 3.0 ports is blue, so a USB 3.0 port can generally be expected to be blue.


However, in rare cases, USB 3.0 may have a port of a color other than blue, or a blue port may not be USB 3.0.

In such a case, please refer to [Method 2] and [Method 3].

[Method 2] Check the symbol mark of the port.

If the symbol mark “SS (SuperSpeed)” is displayed near the USB 3.0 terminal, you know that the port is USB 3.0.



[Method 3] Check from your computer's Device Manager.

You can check the USB port standard from your computer’s Device Manager.

(Device Manager > Universal Serial Bus controllers)

(Screenshot: Device Manager showing Universal Serial Bus controllers)

※ The image is from Windows 10. Also, the type and number of ports vary depending on the computer you are using.

Precautions when connecting a USB 3.0 camera to a computer

As of 2023, most computers on the market are equipped with USB 3.0 ports.

 

However, some computers with multiple ports are equipped with both “USB 3.0” and “USB 2.0” ports.


 

Therefore, please keep the following points in mind when connecting the USB 3.0 camera to your computer.

・Check the computer's USB ports and connect the camera to a USB 3.0 port.

・ Connect the camera directly to a computer without using a USB hub.

※ Depending on the hub, the power and communication speed may be insufficient, and it may not operate properly.

Summary

・A USB 3.0 camera is a camera that supports the USB 3.0 standard, offering high power delivery capacity and communication speed.

・Connect to a USB 3.0 (or USB 3.1, USB 3.2) port for proper use.

 

The USB connection type camera provided by our company is a USB 3.0 camera that supports the USB 3.0 standard.

 

 

 

Industrial USB camera

 

 

 

IF YOUR GIGE CAMERA IS NOISY (E.G. BLACK LINES)

If noise such as black lines appears in the image from your GigE camera, you can try the following technique to resolve the issue.

 

Example: When noise occurs with HiTrigger F-PRO

 

If noise occurs, select "Settings" → "Camera" from the software menu to open the camera properties screen shown below.

 

Decrease the value of the item “Limit Frame Rate” in the red frame and press “OK”.

 

 

Responsiveness is reduced, but the noise is resolved.

 

WHAT IS BAYER?

Bayer refers to the color filter array placed on the image sensor in industrial cameras.

In a single-plate camera, the image sensor can only detect brightness (black and white or monochrome), so red, green, and blue color filters are placed on top of each pixel to obtain color information.

These color filters consist of R, G, and B filters arranged in a 2×2 matrix.

Specifically, the arrangement is as follows.

 

・Bayer RGGB
・Bayer BGGR
・Bayer GBRG
・Bayer GRBG

 

These four patterns are called “Bayer patterns” or “Bayer arrays”.

The camera uses one of these patterns, and the color filter covers the entire image sensor in a repetition of this pattern.

Green appears twice in each pattern because the human eye is more sensitive to green than to red or blue.

Image data obtained through the color filter of the Bayer pattern is called “RAW data” and does not produce a correct color image even if it is displayed as it is.

Eventually, it is processed (demosaiced) and converted into a proper color image format such as RGB.
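With OpenCV, this demosaicing from Bayer RAW data to a color image is a single function call. The sketch below uses synthetic data in place of a real RAW frame; with an actual camera, the conversion code must match the sensor's Bayer pattern (note that OpenCV's constant names do not follow the intuitive Bayer order).

import numpy as np
import cv2

# Synthetic 8-bit RAW frame standing in for data read from an RGGB Bayer sensor
raw = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)

# Demosaic the single-channel RAW data into a 3-channel BGR color image
color = cv2.cvtColor(raw, cv2.COLOR_BayerBG2BGR)
print(color.shape)   # (480, 640, 3)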


WARNING C4996 APPEARS

It has been reported that the following warning appears in the ICubeSDKSample_x32_x64_vs2010 project on Windows [2.0.4.6].

 

a) warning C4996: ‘MBCS_Support_Deprecated_In_MFC’: MBCS support in MFC is deprecated and may be removed in a future version of MFC.
Warnings can be suppressed by adding the following line below “#define VC_EXTRALEAN” in stdafx.h.
#define NO_WARN_MBCS_MFC_DEPRECATION

 

b) warning C4996: 'CWinApp::Enable3dControlsStatic': CWinApp::Enable3dControlsStatic is no longer needed. You should remove this call.
This warning appears when “Use MFC” is set to “Use MFC in static library”.
The Enable3dControlsStatic() function is an old specification and is no longer needed, so please comment out the following line in CICubeSDKSampleApp::InitInstance() of ICubeSDKSample.cpp.
Enable3dControlsStatic(); // Call this function when linking MFC statically