
Camera devices are typical pieces of Internet of Things (IoT) equipment ubiquitously deployed in cyberspace, and they play an important role in maintaining safety in daily life and industrial operations. However, camera devices are usually technically heterogeneous and geographically dispersed, and it is time-consuming for device owners to test which cameras are online and whether they are functioning normally. Lacking security management, camera devices are becoming attractive targets of cyber-attacks. In 2016, up to 1.5 million webcams were attacked by the Mirai malware. The compromised webcams launched a large-scale distributed denial-of-service (DDoS) attack against high-profile network infrastructure, paralyzing half of the U.S. internet. Therefore, accurately identifying the fine-grained type of a device is of great significance for asset management. Conducting cyberspace resource surveying and mapping [1], assessing the impact of equipment vulnerabilities [2], and improving the effectiveness of network device governance [3] also require understanding the types of devices.
Precise identification of camera devices supports asset management, vulnerability assessment, and patch upgrading, and is thus the groundwork for security management. Currently, there are two main kinds of device identification methods, traffic-based and web-search-based, described as follows.
Traffic-based identification approaches can be classified into two main areas: active probing and passive monitoring.
Active Traffic Probing. A server sends a request message to a remote host via its IP address using a specific application protocol (such as HTTP, FTP, or POP3) and extracts features from the response. These features are matched against a pre-established device fingerprint to identify the device. Shodan [4] is the world's first search engine for networked devices based on this technology. It uses Nmap [5] to periodically perform port scans on approximately 600 million devices around the world and processes the returned banner information to identify specific devices. Scholars at the University of Michigan used the self-developed Zmap [6] scanning tool to build the Censys [7] system to search for devices. In addition, ZoomEye, FoFa, and Oshadan are examples of the implementation of active probing [8], which has strong scalability: for a new device type, a fingerprint is simply added to the fingerprint database to establish identification.
This method can customize the request packet to obtain a specific response message, so feature data are easy to acquire. However, with the strengthening of security protection strategies, more and more devices no longer respond to request messages. In such cases, no response data are obtained and device identification fails.
Passive Traffic Monitoring. This technique monitors network traffic without sending any messages. First, it extracts various characteristics, such as protocol parameters, packet fingerprints, or communication patterns, from the traffic. Second, it establishes an appropriate identification model leveraging machine learning theory. Finally, it identifies device types using the identification model.
Y. Meidan et al. [9] took a quadruple composed of a source IP, destination IP, port number, and flag bits (SYN, FIN, ACK, etc.) in the TCP session data as characteristics. They used several machine learning approaches to classify and identify IoT devices.
Miettinen et al. [10] extracted a total of 23 features (e.g., the protocol types of each packet layer and the source and destination port numbers) from network traffic. They built the SENTINEL system using the random forest method to classify IoT devices.
Arunan Sivanathan et al. [11] extracted features such as flow duration, port number, domain name, and cipher suite. They built a multistage classification model based on naive Bayes and random forest methods.
Cheng et al. [12] took the differences among the device file headers in traffic as features and used classification algorithms such as a backpropagation neural network [13], support vector machine [14], and k-nearest neighbors [15] to identify device types.
Yang et al. [16] took advantage of the characteristics of device network protocols at different open systems interconnection layers and neural network algorithms to generate the fingerprints of IoT devices for device identification.
Arunan et al. [17] used traffic characteristics obtained at the network level for IoT device classification. They presented insights into the underlying network traffic characteristics using statistical attributes such as activity cycles, port numbers, signaling patterns, and cipher suites. The work developed a multistage machine learning-based classification algorithm to identify specific IoT devices.
Sakthi et al. [18] proposed GTID, a wireless device identification technique based on traffic measurement metrics obtained with the Ping and iPerf tools. They collect the interarrival times (IATs) of messages at the switch as features to create a unique and reproducible device signature and use artificial neural networks (ANNs) for classification.
This type of method circumvents the problem of devices that do not respond because of security policies, but it also limits the acquisition of high-value features and reduces the accuracy of device classification and identification.
Web-search-based. Web-search-based identification methods acquire knowledge of IoT devices (i.e., related vendors, products, and models) to build a device information repository or establish annotation rules.
Zou et al. [19] proposed an IoT device recognition framework based on web searches, which identified the brand and model of IoT devices by matching their protocol banners with a product attributes database established by crawling specific electronic business websites.
Feng [20] proposed the ARE engine, which extracts relevant terms from response data as search queries to crawl websites. They used an association algorithm to generate rules for annotating IoT devices with vendors, products, and models.
Agarwal et al. [21] developed a tool named WID that captures the web pages of IoT devices and performs device type classification by analyzing the source code of the web pages.
This type of approach does not need manually constructed fingerprints or training data to identify device types, but its accuracy relies on the reliability of Internet resources, and it requires the design of very complex rules for matching attribute information, which limits the accuracy and recall of the algorithm.
In addition, there are device type identification methods based on clock deviation [22,23,24,25,26,27]. These methods use the deviation that remains after each device is calibrated by the network time protocol (NTP) to classify IoT devices. However, the clock deviation of a device is very difficult to measure, and it is difficult to distinguish the clock deviations of devices from the same manufacturer, so identifying the specific type of a device in this way can cause large errors.
The above methods achieve high accuracy in identifying coarse-grained device types. However, accurately identifying fine-grained device types remains a challenge. We propose a fine-grained camera device identification method based on the inherent features of devices. Inherent features represent characteristics of the device itself, such as device-specific parameters, geometric shapes, physical properties, and technical parameters. Few studies have used inherent features to classify and identify devices.
We believe that inherent features can effectively identify fine-grained device types. When distinguishing a device, our method determines the weight of each feature based on the coverage and differences of the inherent features. We develop a feature similarity calculation strategy (FSCS) based on the manifestation of each inherent feature value. Based on the weights and the FSCS, we build a device identification model to establish fine-grained identification of a device type. The detailed contributions of this work are given below:
1) We propose a new feature weight determination strategy. The method proposed in this paper selects features based on the coverage of inherent attributes. The greater the feature differences among device types, the better the features discriminate. For the selected inherent characteristics, we calculate a feature entropy value from the feature differences and use it to determine each feature weight, providing a feature weight determination strategy with a theoretical foundation.
2) We design a more reasonable feature similarity calculation rule. The inherent characteristics are classified into "phrase type", "numerical type", "interval type", and "collection type" according to their expression forms, and a similarity calculation rule is developed for each type. Compared with a strategy that uses a single rule for all types, the per-type rules are more concise and cheaper to compute.
3) We construct a fine-grained camera equipment type identification model based on inherent characteristics. Based on the feature weights and similarity calculation rules, we design a device type identification model using the idea of weighted average. This model can recognize fine-grained types of target devices with high accuracy even when some inherent feature values are missing, and it has good applicability in a real environment.
The rest of the paper is organized as follows: Section 2 introduces the methodology and its steps in detail. The rationality and reliability analysis of the method is given in Section 3. Section 4 presents the experimental results and analysis. Finally, Section 5 concludes the paper and discusses future work.
In this section, we explain our proposed method in detail. We first present the basic framework, and we explain the notations of our algorithm. Then, we analyze the FSCS. Finally, we build the device identification model. Our proposed basic framework is shown in Figure 1, which includes three phases described as follows.
Extracting the inherent features. From data about the inherent attributes (e.g., size, weight, and parameters) of various types of equipment, we select the inherent attributes with large differences among types and wide coverage as the inherent features fi,n.
Building the identification model. First, we design the calculation rules of feature similarity using the manifestation of the inherent features. Then, the feature weights are determined from the information entropy of the features in the dataset. Finally, we construct the identification model according to the calculation rules of feature similarity and the weights of the features.
Identifying the device type. Based on the inherent features of the target device, the similarity between the target device and known devices is calculated to identify the type of the target device, and the device model with the largest similarity value is considered as the model of the target device.
The notations used to describe our proposed approach and the means of evaluating them are summarized below.
f: An inherent feature of the device. fi,n is the nth feature of the ith device.
F: A collection of all inherent features of the device, and the inherent features of the ith device are Fi={fi,1,fi,2,…,fi,n}
Ti: The type of device i. Tno indicates an unknown device type; that is, if the type of device i is unknown, then Ti=Tno.
devi: A two-tuple composed of the inherent features and type of device i, devi=<Fi,Ti>.
Dbase: A knowledge set defined as the collection of all known types of devices, Dbase={<Fbase,1,Tbase,1>,<Fbase,2,Tbase,2>,…,<Fbase,n,Tbase,n>}, Fbase, n is the feature set of the nth device in the Dbase database and Tbase, n is the type of the nth device in the Dbase database. Tbase,i≠Tno(1≤i≤n).
Dtarget: The target set is defined as the set consisting of all types of devices to be identified, Dtarget={<Ftarget,1,Tno>,<Ftarget,2,Tno>,…,<Ftarget,s,Tno>}.
Similarity(fi,k,fj,k): The similarity of the k-th inherent feature between Fj and Fi. At this time, fi,k is called the benchmark feature, and fj,k is the target feature (note: Similarity(fi,k,fj,k) is not necessarily equal to Similarity(fj,k,fi,k)).
Similarity(Fi,Fj): The similarity of the feature sets Fi and Fj. At this time, Fi is called the benchmark feature set, and Fj is the target feature set (note: Similarity(Fi,Fj) is not necessarily equal to Similarity(Fj,Fi)).
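To make the notation concrete, here is a minimal Python sketch (the names `Features`, `d_base`, and `d_target` are our own illustrative choices, not from the paper): a device devi is encoded as the two-tuple <Fi, Ti>, with `None` playing the role of the unknown type Tno.

```python
from typing import Dict, List, Optional, Tuple

# Hypothetical encoding of the notation above: F_i maps feature names
# to values, and T_i is the device type (None stands for T_no).
Features = Dict[str, object]
Device = Tuple[Features, Optional[str]]

# D_base: the knowledge set, in which every device type is known.
d_base: List[Device] = [
    ({"shape": "Dome", "weight": 406}, "Dahua DH-IPC-HDBW2230R-AS"),
    ({"shape": "Bullet", "weight": 772}, "Dahua DH-IPC-HFW4433F-ZAS"),
]

# D_target: the target set, in which every device type is T_no (None).
d_target: List[Device] = [
    ({"shape": "Dome", "weight": 410}, None),
]
```
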
Different inherent features are usually expressed in different forms. For example, the size of a device is usually "length × width × height" or "bottom radius × height", and the shape of a device is usually "square", "cylindrical" or "spherical". Therefore, it is necessary to design different feature similarity calculation rules for features with different expressions. By analyzing how the inherent features of camera devices are expressed, we divide their manifestations into four types: "phrase type", "numerical type", "interval type" and "collection type", and we design a similarity calculation rule for each of these four manifestations.
For "phrase type", when the feature phrases are the same, the feature similarity is 1; otherwise, it is 0. Considering the k-th feature of the inherent feature vectors Fbase and Fi as a "phrase type" feature, the similarity calculation strategy between the target feature fi,k and the benchmark feature fbase,k is shown in Eq (1).
Similarity\left( {{f_{base, k}}, {f_{i, k}}} \right) = \begin{cases} 1, & if\ {f_{base, k}} = {f_{i, k}} \\ 0, & if\ {f_{base, k}} \ne {f_{i, k}} \end{cases} | (1) |
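Eq (1) is a plain exact-match test; a hypothetical Python rendering (the function name is ours):

```python
def phrase_similarity(f_base: str, f_target: str) -> float:
    """Eq (1): "phrase type" features are either identical (1) or not (0)."""
    return 1.0 if f_base == f_target else 0.0
```
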
For the "numerical type", the size of the feature value and the measurement error ε of the feature value are comprehensively considered. If the kth feature of the inherent feature vectors Fbase and Fi is a "numerical type" feature, we can obtain the similarity calculation strategy between the target feature fi,k and the benchmark feature fbase,k described as Eq (2).
Similarity\left( {{f_{base, k}}, {f_{i, k}}} \right) = \begin{cases} 1 - \dfrac{\left| {f_{base, k}} - \left( {f_{i, k}} + \varepsilon \right) \right|}{\max \left( {f_{base, k}}, {f_{i, k}} \right)}, & if\ {f_{base, k}} > {f_{i, k}} \\ 1, & if\ {f_{base, k}} = {f_{i, k}} \\ 1 - \dfrac{\left| {f_{base, k}} - \left( {f_{i, k}} - \varepsilon \right) \right|}{\max \left( {f_{base, k}}, {f_{i, k}} \right)}, & if\ {f_{base, k}} < {f_{i, k}} \end{cases} | (2) |
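One possible reading of Eq (2) in Python, with `eps` standing for the measurement error ε (function and parameter names are ours): the measured target value is shifted by ε toward the benchmark before the normalized difference is taken.

```python
def numeric_similarity(f_base: float, f_target: float, eps: float = 0.0) -> float:
    """Eq (2): "numerical type" similarity with measurement tolerance eps.

    The target value is shifted by eps toward the benchmark value,
    then the absolute difference is normalized by the larger value.
    """
    if f_base == f_target:
        return 1.0
    shifted = f_target + eps if f_base > f_target else f_target - eps
    return 1.0 - abs(f_base - shifted) / max(f_base, f_target)
```

For example, two weight readings of 406 g and 410 g with a 2 g tolerance score just below 1.
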
For "interval type", the calculation strategy is designed based on an interval inclusion relationship. If the kth feature of the inherent feature vectors Fbase and Fi is an "interval type" feature, then the similarity calculation strategy between the target feature fi,k and the benchmark feature fbase,k is constructed as Eq (3).
Similarity\left( {{f_{base, k}}, {f_{i, k}}} \right) = \begin{cases} 1, & if\ {f_{i, k}} \subseteq {f_{base, k}} \\ 0, & if\ {f_{i, k}} \not\subseteq {f_{base, k}} \end{cases} | (3) |
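Eq (3) is a containment test on closed intervals; a sketch (names and the `(lo, hi)` tuple encoding are ours), where a single number a would be passed as the degenerate interval (a, a):

```python
from typing import Tuple

Interval = Tuple[float, float]  # a closed interval [lo, hi]

def interval_similarity(f_base: Interval, f_target: Interval) -> float:
    """Eq (3): 1 when the target interval lies inside the benchmark interval."""
    return 1.0 if f_base[0] <= f_target[0] and f_target[1] <= f_base[1] else 0.0
```
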
For "collection type", the benchmark feature is used as a reference: the target feature and the benchmark feature are vectorized, and the cosine similarity is used to measure the feature similarity. If the kth feature of the inherent feature vectors Fbase and Fi is a "collection type" feature, where {f_{base, k}} = \left( {{e_{base, 1}}, {e_{base, 2}}, \cdots, {e_{base, n}}} \right) represents the benchmark feature and {f_{i, k}} = \left( {{e_{i, 1}}, {e_{i, 2}}, \cdots, {e_{i, m}}} \right) represents the target feature, suppose the vectorization of the benchmark feature is expressed as {V_{base, k}} = \left( {{a_{base, 1}}, {a_{base, 2}}, \cdots, {a_{base, n}}} \right) and the vectorization of the target feature is expressed as {V_{i, k}} = \left( {{b_{i, 1}}, {b_{i, 2}}, \cdots, {b_{i, n}}} \right). The vectorization process is as follows: for {V_{base, k}} , \forall r \in {Z^ + } , {\text{ }}1 \leqslant r \leqslant n , {a_{base, r}} = 1 ; for {V_{i, k}} , \forall r \in {Z^ + } , {\text{ }}1 \leqslant r \leqslant n , if {e_{base, r}} \in {f_{i, k}} , then {b_{i, r}} = 1 , otherwise {b_{i, r}} = 0 . After vectorization, the similarity calculation strategy is expressed as Eq (4).
Similarity\left( {{f_{base, k}}, {f_{i, k}}} \right) = \frac{{\sum\limits_{r = 1}^{n} {{b_{i, r}}} }}{{\sqrt {\left\| {{f_{base, k}}} \right\| \times \sum\limits_{r = 1}^{n} {b_{i, r}^2} } }} | (4) |
where {b_{i, r}} is the rth component of the vectorized target feature {V_{i, k}} .
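Since the benchmark collection vectorizes to the all-ones vector, Eq (4) as written reduces to the cosine between that vector and the target's indicator vector. A sketch (names ours):

```python
from math import sqrt
from typing import Sequence, Set

def collection_similarity(f_base: Sequence[str], f_target: Set[str]) -> float:
    """Eq (4): cosine similarity after vectorizing against the benchmark.

    The benchmark collection vectorizes to the all-ones vector; the target
    vectorizes to b, with b_r = 1 iff the r-th benchmark element occurs in
    the target collection, and 0 otherwise.
    """
    b = [1 if e in f_target else 0 for e in f_base]
    hits = sum(b)
    if hits == 0:
        return 0.0
    return hits / sqrt(len(f_base) * sum(x * x for x in b))
```

For a two-resolution benchmark, a target sharing both resolutions scores 1, while a target sharing one of the two scores 1/√2.
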
According to the differences in the inherent features, an entropy value of a feature is calculated as the weight value. By using the FSCS combined with the inherent feature weights, the device identification model is constructed. The determination strategy of the inherent feature weight is as follows.
Given a feature {f_k} , for each distinct value v of the feature, let {c_v} be the number of occurrences of v in the knowledge set, and let p = {c_v}/N be the frequency of occurrence of v, where N is the size of the knowledge set. Approximating p as the probability of v, we can calculate the entropy of the feature as Eq (5).
{H_k} = - \sum\limits_i {{p_i} \cdot \log \left( {{p_i}} \right)} | (5) |
where {p_i} represents the probability function of the ith value of the feature {f_k} .
After calculating the entropy {H_1}, {H_2}, \cdots, {H_n} of all features, the weight value {w_1}, {w_2}, \cdots, {w_n} of each feature is calculated as Eq (6).
{w_i} = \frac{{{H_i}}}{{\sum\limits_{k = 1}^n {{H_k}} }} | (6) |
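Eqs (5) and (6) can be sketched as follows, estimating each {p_i} by the value's frequency in the knowledge set (function names are ours):

```python
from collections import Counter
from math import log
from typing import List, Sequence

def feature_entropy(values: Sequence[object]) -> float:
    """Eq (5): Shannon entropy of one feature's empirical value distribution."""
    n = len(values)
    return -sum((c / n) * log(c / n) for c in Counter(values).values())

def feature_weights(columns: List[Sequence[object]]) -> List[float]:
    """Eq (6): entropies normalized so that the feature weights sum to 1."""
    entropies = [feature_entropy(col) for col in columns]
    total = sum(entropies)
    return [h / total for h in entropies]
```

A feature whose values differ more across devices (e.g., four distinct weights vs. two repeated shapes) receives the larger weight.
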
According to the similarity calculation strategy and the weight of each feature, a feature set similarity model is constructed as Eq (7).
Similarity\left( {{F_{base}}, {F_i}} \right) = \sum\limits_{k = 1}^{\left\| {{F_{base}}} \right\|} {Similarity\left( {{f_{base, k}}, {f_{i, k}}} \right) \times {w_k}} | (7) |
The identification model is used to calculate the similarity between the target device and the known devices. In the process of device type identification, the device type with the largest similarity is considered the device type of the target device, so the device type identification is as Eq (8).
{T_i} = {T_{base, k}}, \ where\ Similarity\left( {{F_{base, k}}, {F_i}} \right) = \max \left\{ {Similarity\left( {{F_{base, j}}, {F_i}} \right): < {F_{base, j}}, {T_{base, j}} > \in {D_{base}}} \right\} | (8) |
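Putting Eqs (7) and (8) together, a minimal sketch of the identification step (the per-feature rule table `sims` and all names are our illustration): a feature missing from the target simply contributes 0 to the weighted sum, which is how the model tolerates missing inherent feature values.

```python
from typing import Callable, Dict, List, Tuple

Features = Dict[str, object]
SimFn = Callable[[object, object], float]

def set_similarity(f_base: Features, f_target: Features,
                   weights: Dict[str, float],
                   sims: Dict[str, SimFn]) -> float:
    """Eq (7): weighted sum of per-feature similarities over the benchmark
    feature set; features absent from the target contribute 0."""
    return sum(sims[k](v, f_target[k]) * weights[k]
               for k, v in f_base.items() if k in f_target)

def identify(f_target: Features,
             d_base: List[Tuple[Features, str]],
             weights: Dict[str, float],
             sims: Dict[str, SimFn]) -> str:
    """Eq (8): return the type of the most similar known device."""
    best = max(d_base,
               key=lambda dev: set_similarity(dev[0], f_target, weights, sims))
    return best[1]
```
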
In this section, we analyze the effectiveness of inherent-feature-based device identification and the rationality of the FSCS.
Devices of different models usually differ in their inherent features. For example, as shown in Table 1, camera devices of different models differ in inherent features such as appearance, size, weight, and so on. It is therefore sound and feasible to distinguish devices by their inherent features. However, we should avoid identifying a device by a single inherent feature, since measuring the value of an inherent feature may introduce deviation; as many inherent features as possible should be used instead.
Camera device model | Appearance | Size (mm) | Weight (g) | Imaging component (inches) | Compression rate (bps) |
Dahua DH-IPC-HDBW2230R-AS | Dome | φ122 × 89 | 406 | CMOS, 1/2.7 | 6K~8M |
Dahua DH-IPC-HDBW4636R-AS | Dome | φ122 × 89 | 444 | CMOS, 1/2.9 | 73K~10M |
Dahua DH-IPC-HDBW4833R-ZAS | Dome | φ122 × 88.9 | 494 | CMOS, 1/1.8 | 16K~14.75M |
Dahua DH-IPC-HFW4433F-ZAS | Bullet | 186 × 87 × 85 | 772 | CMOS, 1/3.0 | 8K~10M |
Hikvision DS-2CD1311D | Dome | φ127 × 97.5 | 570 | CMOS, 1/3.0 | 32K~16M |
Hikvision DS-2CD3345P1 | Dome | φ127.3 × 103.7 | 340 | CMOS, 1/3.0 | 32K~16M |
Hikvision DS-2CD3646F | Barrel | 191.4 × 97.9 × 93.5 | 1260 | CMOS, 1/2.7 | 32K~8M |
Hikvision DS-2CD3726F | Dome | 153.3 × 153.3 × 111.6 | 840 | CMOS, 1/2.7 | 32K~8M |
Table 1 shows that almost all devices of different types have different values of the "weight" inherent attribute. However, in the actual classification process the value of the "weight" attribute must be measured, and deviations are prone to occur during measurement, so using only the "weight" attribute to identify devices results in large errors. It is necessary to use multiple inherent features to identify a camera device; therefore, the use of multiple inherent features in this method identifies devices more effectively.
There are four types of inherent features: "phrase", "numerical", "interval" and "collection", and each has a corresponding FSCS.
A. Phrase
The inherent features of the "phrase" type are described by one or more words. For example, the inherent feature of the shape of a camera device may be described by words like "dome", "bullet", or "barrel". The FSCS of inherent features of the "phrase" type is determined by whether the phrases are the same, so it's reasonable to use Eq (1) to calculate the similarity of two features of the "phrase" type.
B. Numerical
For inherent features of the "numerical" type, the similarity depends on numerical differences, and the smaller the numerical difference between two features is, the larger the similarity. Suppose that the kth feature of the vectors {F_{base}} and {F_i} are both of "numerical" type, a naive calculation strategy for the similarity between the target and benchmark feature would be expressed as Eq (9).
Similarity\left( {{f_{base, k}}, {f_{i, k}}} \right) = \left\{ {\begin{array}{*{20}{c}} {{\text{ }}1, {\text{ }}if{\text{ }}{f_{base, k}} = {f_{i, k}}} \\ {\frac{1}{{\left| {{f_{base, k}} - {f_{i, k}}} \right|}}, {\text{ }}if{\text{ }}{f_{base, k}} \ne {f_{i, k}}} \end{array}} \right. | (9) |
Note that the impact of \left| {{f_{base, k}} - {f_{i, k}}} \right| depends on the magnitude of {f_{base, k}} or {f_{i, k}} , so Eq (9) should be revised as Eq (10).
Similarity\left( {{f_{base, k}}, {f_{i, k}}} \right) = 1 - \frac{{\left| {{f_{base, k}} - {f_{i, k}}} \right|}}{{\max \left( {{f_{base, k}}, {f_{i, k}}} \right)}} | (10) |
Practically, values of inherent features of the "numerical" type are acquired by measurement. Due to the precision of measurement tools and human errors during the measurement process, there might be a deviation between the real value and measured value of a feature. So, we introduce an error tolerance \epsilon into Eq (10) and obtain Eq (2).
C. Interval
For inherent features of the "interval" type, the similarity is determined by the relationship between intervals. If the interval value of the benchmark feature is the same as that of the target feature, the two features are considered the same; otherwise, they are considered different. Thus, supposing that the kth feature of the inherent feature vectors {F_{base}} and {F_i} are both of "interval" type, a naive calculation strategy for the similarity between the target feature {f_{i, k}} and the benchmark feature {f_{base, k}} would be described as Eq (11).
Similarity\left( {{f_{base, k}}, {f_{i, k}}} \right) = \left\{ {\begin{array}{*{20}{c}} {1, {\text{ }}if{\text{ }}{f_{i, k}} = {f_{base, k}}} \\ {0, {\text{ }}if{\text{ }}{f_{i, k}} \ne {f_{base, k}}} \end{array}} \right. | (11) |
Practically, the interval of a benchmark feature is usually complete, while the interval of a target feature is often a subset of it, so calculating similarity using Eq (11) may be erroneous. We think it is reasonable to use Eq (3) to evaluate the similarity of features of the "interval" type: when the target feature interval is a subset of the benchmark feature interval, the two features are similar; otherwise, they are not. Note that when an interval value is a single number a, it should be regarded as the degenerate interval [a, a].
D. Collection
Features of the "collection" type contain multiple elements, and the similarity between features can be determined by the inclusion relationship between collections, or by cosine similarity between vectors after collection vectorization.
The calculation strategy of cosine similarity is considered to be more accurate than the strategy based on the inclusion relationship. Taking the "image resolution" feature as an example, suppose that the benchmark and target features are {f_{base}} = \{ a \times b, c \times d\} , {f_1} = \{ a \times b\} and {f_2} = \{ a \times b, c \times d\} . When using the inclusion-relationship strategy, the similarity values of {f_1} vs. {f_{base}} and {f_2} vs. {f_{base}} are both 1. However, when using the cosine similarity strategy, the similarity value of {f_1} vs. {f_{base}} is strictly less than 1 (namely 1/\sqrt 2 by Eq (4)) while the value of {f_2} vs. {f_{base}} is 1. Obviously, the cosine similarity strategy yields a smaller error and is more reasonable.
However, "interval type" features are difficult to vectorize due to the continuous elements, so it is not suitable to use the cosine similarity to measure feature similarity. In contrast, the elements in the "collection type" feature are discrete, and the vectorization process is simple. It is undoubtedly a more reasonable choice to construct an FSCS based on the cosine similarity method with smaller errors. The above analysis shows that the inherent FSCS adopted in this method is reasonable.
In this section, we conduct two experiments: 1) when the feature sets are complete, a device type identification experiment verifies the feasibility of the method; 2) when some features are missing, a device type identification experiment verifies the effectiveness of the method.
We select 40 fine-grained device types from camera products by Dahua, Hikvision, ACTi, and TP-LINK as the knowledge set of the experiment. The specific device types are shown in Table 2; each device is numbered for convenience of recording and presentation.
No. | Device model | No. | Device model |
T01 | Dahua DH-IPC-HDBW1020R | T21 | Hikvision DC-2CD2T45 |
T02 | Dahua DH-IPC-HDBW2130R-AS | T22 | Hikvision DS-2CD1225 |
T03 | Dahua DH-IPC-HDBW2230R-AS | T23 | Hikvision DS-2CD1311D |
T04 | Dahua DH-IPC-HDBW4636R-AS | T24 | Hikvision DS-2CD3310F-I |
T05 | Dahua DH-IPC-HDBW4833R-ZAS | T25 | Hikvision DS-2CD3345P1 |
T06 | Dahua DH-IPC-HDPW4233-PT-SA-0360B | T26 | Hikvision DS-2CD3410FD-IW |
T07 | Dahua DH-IPC-HFW4433F-ZAS | T27 | Hikvision DS-2CD3646F |
T08 | Dahua DH-IPC-HFW4631K-AS | T28 | Hikvision DS-2CD3726F |
T09 | Dahua DH-IPC-HFW8841K-ZRL-DS | T29 | Hikvision DS-2CD3935FWD-IWS |
T10 | Dahua DH-IPC-PDBW4638-B270 | T30 | Hikvision DS-2CD3942F-I |
T11 | Dahua DH-CA-DW48 | T31 | Hikvision DS-2CD3A20F-IS |
T12 | Dahua DH-CA-FW19M-IR8 | T32 | Hikvision DS-2CD3T25-I5 |
T13 | Dahua DH-HAC-HDW2208E | T33 | Hikvision DS-2CD3T56WD-I5 |
T14 | Dahua DH-IPC-EW4431-ASW | T34 | Hikvision DS-IPC-E22H-IW |
T15 | Dahua DH-IPC-HDBW4243E-ZFD | T35 | Hikvision DS-IPC-T12-I |
T16 | ACTi ACM-3001 | T36 | TP-LINK TL-IPC223 |
T17 | ACTi ACM-5001 | T37 | TP-LINK TL-IPC313K-W10 |
T18 | ACTi ACM-3511 | T38 | TP-LINK TL-IPC43KZ |
T19 | ACTi ACM-5601 | T39 | TP-LINK TL-IPC546H |
T20 | ACTi ACM-7411 | T40 | TP-LINK TL-IPC646P-A4 |
For each device, the device shape ( {f_1} ), size ( {f_2} ), weight ( {f_3} ), imaging component ( {f_4} ), resolution ( {f_5} ), minimum illumination ( {f_6} ), lens parameters ( {f_7} ), compression code rate ( {f_8} ), electronic shutter time ( {f_9} ) and ambient temperature ( {f_{10}} ) are used as the inherent features for device identification. The inherent characteristic values of the different device types are shown in Table 3.
No. | f1 | f2 (mm) | f3 (g) | f4 (inch) | f5 | f6 (Lux) | f7 | f8 (bps) | f9 (s) | f10 (℃)
T01 | Dome | 122×89 | 400 | CMOS, 1/4.0 | {704×576;1280×720} | 0.4;0.22; | {6, F2.6, 37;3.6, F2.5, 59;2.8, F2.5, 70.5} | 5K~5M | 1/3~1/100000 | -30~60
T02 | Dome | 122×89 | 462 | CMOS, 1/3.0 | {704×576;1280×960;1280×720;704×480} | 0.01;0.001;0 | {8, F2.5, 35.5;6, F2.1, 47;3.6, F2.0, 72.2;2.8, F2.0, 91.7} | 12K~6M | 1/3~1/100000 | -40~60
T03 | Dome | 122×89 | 406 | CMOS, 1/2.7 | {704×576;1920×1080;704×480} | 0.01;0.001;0 | {8, F2.2, 42;6, F2.0, 55;3.6, F2.0, 90;2.8, F2.0,115} | 6K~8M | 1/3~1/100000 | -40~60
T04 | Dome | 122×89 | 444 | CMOS, 1/2.9 | {2688×1520;704×576;2592×1944;1280×720;3072×2048;704×480;2560×1440} | 0.002;0.0002;0 | {6, F2.5, 47.34;3.6, F2.2, 70;2.8, F2.0, 99} | 73K~10M | 1/3~1/100000 | -30~60
T05 | Dome | 122×88.9 | 494 | CMOS, 1/1.8 | {704×576;1920×1080;3840×2160;704×480} | 0.002;0.0002;0 | {12, F2.8, 45.5;3.5, F1.9,110} | 16K~14.75M | 1/3~1/100000 | -40~55
T06 | Dome | 124.6×82.3 | 265 | CMOS, 1/2.8 | {704×576;1920×1080;1920×10801;704×480} | 0.002;0.0002;0 | {3.6, F1.6, 87} | 12K~10M | 1/3~1/100000 | -20~50
T07 | Bullet | 186×87×85 | 772 | CMOS, 1/3.0 | {704×576;1280×720;704×480;2592×1520;2560×1440} | 0.001;0.0001;0 | {13.5, F3.13, 27.7;2.7, F1.6,104.2} | 8K~10M | 1/3~1/100000 | -30~60
T08 | Bullet | 194×96×89 | 640 | CMOS, 1/2.9 | {2688×1520;704×576;2592×1944;1280×720;3072×2048;704×480;2560×1440} | 0.002;0.0002;0 | {6, F2.5, 47.34;3.6, F2.2, 70;2.8, F2.0, 99} | 73K~10M | 1/3~1/100000 | -40~60
T09 | Bullet | 221×109×101 | 1080 | CMOS, 1/1.8 | {704×576;1920×1080;3840×2160;1704×576;704×480} | 0.01;0.001;0 | {2.7, F1.2,111} | 32K~8M | 1/3~1/100000 | -30~60
T10 | Dome | 285.1×100.8 | 2600 | CMOS, 1/2.8 | {704×576;1920×1080;1280×720;1280×960;704×480} | 0.002;0.0002;0 | {12, F2.7, 44;2.7, F1.8,105} | 16K~8M | 1/3~1/100000 | -30~60
T11 | Dome | 113.6×85.4 | 300 | CCD, 1/1.3 | {976×582} | 0.001;;0 | {2.8, F1.2, 0} | 16K~8M | 1/50~1/100000 | -30~60
T12 | Bullet | 194.4×96.6×89.5 | 400 | CMOS, 1/3.0 | {1280×720} | 0.01;; | {3.6, F1.2, 0} | 16K~8M | 1/25~1/100000 | -30~60
T13 | Dome | 110×95 | 440 | CMOS, 1/2.8 | {1920×1080;1280×720} | 0.001;; | {3.6, F1.2, 0} | 16K~8M | 1/1~1/30000 | -30~60
T14 | Dome | 126×37.7 | 500 | CMOS, 1/3.0 | {2688×1520;704×576;1920×1080;1280×720} | 0.001;; | {1.6, F1.4,180} | 14K~40M | 1/3~1/100000 | -10~50
T15 | Dome | 159.1×117.9 | 925 | CMOS, 1/2.8 | {704×576;1920×1080;704×480} | 0.002;0.0002;0 | {35, F1.8, 13;13.5, F3.05, 31;7, F1.4, 36;2.7, F1.6,110} | 16K~8M | 1/3~1/100000 | -40~60
T16 | Dome | 130×99 | 350 | CMOS, 1/3.0 | {320×240;160×120;640×480} | ; ; | {4.2, F1.8, 0} | | 1/5~1/2000 | 0~40
T17 | Bullet | 67×55 | 400 | CMOS, 1/3.0 | {320×240;160×120;640×480} | ; ; | {4.2, F1.8, 0} | | 1/5~1/2000 | 0~50
T18 | Dome | 130×99 | 380 | CMOS, 1/3.0 | {320×240;160×112;640×480;1280×1024;1280×720} | ; ; | {3.3, F1.6, 0} | | 1/5~1/2000 | -10~45
T19 | Bullet | 67×55×129.5 | 400 | CMOS, 1/3.0 | {640×480;1280×720;1280×1024} | ; ; | {4.2, F1.8, 0} | | 1/5~1/2000 | 0~50
T20 | Dome | 151.69×114.9 | 1040 | CMOS, 1/3.0 | {640×480;1280×720;1280×1024} | ; ; | {3.3, F1.4, 0} | | 1/10~1/2000 | -30~50
T21 | Barrel | 90×85×169 | 550 | CMOS, 1/3.0 | {2560×1440} | 0.1;; | {1.68, F2.0,180} | 32K~16M | 1/3~1/100000 | -30~60
T22 | Bullet | 100.5×88.1×157.3 | 600 | CMOS, 1/4.0 | {1280×720} | 0.01;; | {12, F1.2, 26;8, F1.2, 39;6, F1.2, 50;4, F1.2, 76;2.8, F1.2, 95} | 32K~8M | 1/25~1/100000 | -30~60
T23 | Dome | 127×97.5 | 570 | CMOS, 1/3.0 | {1280×960} | 0.01;; | {16, F1.2, 18;12, F1.2, 22;8, F1.2, 33;6, F1.2, 50;4, F1.2, 69;2.8, F1.2, 90} | 32K~16M | 1/25~1/100000 | -30~60
T24 | Dome | 114.6×89.4 | 670 | CMOS, 1/3.0 | {1280×960} | 0.01;; | {4, F2.0, 75.8} | 32K~8M | 1/3~1/100000 | -30~60
T25 | Dome | 127.3×103.7 | 340 | CMOS, 1/3.0 | {2560×1440} | 0.01;; | {4, F1.2, 69;2.8, F1.2, 90} | 32K~16M | 1/3~1/100000 | -10~40
T26 | Bullet | 66×139.1×70.6 | 400 | CCD, 1/3.0 | {1280×920} | 0.02;; | {4, F2.0, 75.8} | 32K~16M | 1/25~1/100000 | -25~60
T27 | Barrel | 191.4×97.9×93.5 | 1260 | CMOS, 1/2.7 | {2560×1440} | 0.005;; | {2.8, F1.2,105} | 32K- > 8M | 1/3- > 1/10-5 | -30- > 60 |
T28 | Barrel | 153.3×153.3×111.6 | 840 | CMOS, 1/2.7 | {1920×1080} | 0.002;; | {2.7, F1.2,103} | 32K- > 8M | 1/3- > 1/10-5 | -30- > 60 |
T29 | Dome | 119.9×41.2 | 600 | CMOS, 1/2.8 | {1600×1200;1280×960;2048×1536} | 0.005;;0 | {1.16, F2.2,180} | 32K- > 16M | 1/3- > 1/10-5 | -10- > 40 |
T30 | Dome | 119.9×41.2 | 600 | CMOS, 1/3.0 | {2048×1536} | 0.01;;0 | {1.6, F1.6,186} | 32K- > 8M | 1/25- > 1/10-5 | -10- > 40 |
T31 | Barrel | 134×116.1×293 | 2000 | CMOS, 1/2.8 | {1920×1080} | 0.01;;0 | {25, F1.2, 12.4;16, F1.2, 19.2;12, F1.2, 24.6;8, F1.2, 40;6, F1.2, 52;4, F1.2, 85} | 32K- > 8M | 1/3- > 1/10-5 | -30- > 60 |
T32 | Bullet | 194.04×93.85×89.52 | 1000 | CMOS, 1/2.7 | {1920×1080;1280×960;1296×732;1280×720} | 0.01;; | {4, F1.2, 80} | 32K- > 8M | 1/3- > 1/10-5 | -30- > 60 |
T33 | Bullet | 93.85×93.52×194.1 | 690 | CMOS, 1/2.7 | {704×576;2560×1920;640×480;1280×720;2560×1536;1920×1280;352×288} | 0.005;; | {6, F1.2, 55;4, F1.2, 83} | 32K- > 8M | 1/3- > 1/10-5 | -30- > 60 |
T34 | Barrel | 175×89×75 | 340 | CMOS, 1/2.7 | {1920×1080} | 0.01;; | {8, F1.2, 43;6, F1.2, 54.4;4, F1.2, 89.1;2.8, F1.2,114.8} | 32K- > 8M | 1/3- > 1/10-5 | -30- > 50 |
T35 | Dome | 110×93.2 | 350 | CMOS, 1/2.7 | {1920×1080} | 0.01;; | {8, F1.2, 43;6, F1.2, 54.5;4, F1.2, 91;2.8, F1.2,114.8} | 32K- > 8M | 1/3- > 1/10-5 | -30- > 60 |
T36 | Dome | 113×113×87 | 268 | CMOS, 1/2.7 | {1920×1080} | 0.1;0.1;0 | {4, F2.1, 0;6, F2.1, 0} | 64K- > 8M | 1/25- > 1/10-5 | -30- > 60 |
T37 | Barrel | 173×83.4×84.2 | 390 | CMOS, 1/3.0 | {1280×960} | 0.1;0.1;0 | {2.8, F2.1, 0;4, F2.1, 0;6, F2.1, 0} | 64K- > 4M | 1/25- > 1/10-5 | -10- > 60 |
T38 | Dome | 120×120×129 | 328 | CMOS, 1/2.7 | {2304×1296} | 0.1;;0 | {3.35, F2.4, 0} | 64K- > 2M | 1/25- > 1/10-5 | -30- > 50 |
T39 | Barrel | 173×83.4×84.2 | 390 | CMOS, 1/2.7 | {2560×1440} | 0.01;; | {4, F1.6, 0;6, F1.6, 0;8, F1.6, 0;12, F1.6, 0} | 256K- > 6M | 1/25- > 1/10-5 | -30- > 60 |
T40 | Ball | 230×177×188 | 930 | CMOS, 1/3.0 | {2560×1440} | 0.002;;0 | {4, F1.6, 0} | 64K- > 6M | 1/25- > 1/10-5 | -30- > 60 |
In Table 3, f1 and f4 are "phrase type" inherent features; f2, f3, and f6 are "numerical value" inherent features; f8, f9, and f10 are "interval type" inherent features; and f5 and f7 are "collective type" inherent features. The elements of f6 represent the minimum illumination of the color camera, the black-and-white camera, and the infrared camera. For example, in the f6 value of T01, "0.4; 0.22; -" indicates that the minimum illuminance of the T01 color camera is 0.4 lux, the minimum illuminance of the black-and-white camera is 0.22 lux, and the infrared camera is not supported. Each set element of f7 comprises a focal length, an aperture, and a horizontal field of view. For example, in the f7 value of T01, "6, F2.6, 37" indicates a focal length of 6 mm, an aperture of F2.6, and a horizontal field of view of 37 degrees.
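The four feature types above each need their own similarity measure. The paper's exact FSCS definitions are not reproduced in this excerpt, so the following is only an illustrative sketch: exact match for phrase features, a tolerance test for numerical features, overlap ratio for interval features, and the Jaccard index for collective features. All function names are ours, not the paper's.

```python
def phrase_sim(a, b):
    """'Phrase type' (e.g., f1 appearance): exact match or nothing."""
    if a is None or b is None:
        return 0.0
    return 1.0 if a == b else 0.0

def numeric_sim(a, b, eps=1.0):
    """'Numerical value' (e.g., f3 weight): match within tolerance eps."""
    if a is None or b is None:
        return 0.0
    return 1.0 if abs(a - b) <= eps else 0.0

def interval_sim(a, b):
    """'Interval type' (e.g., f10 temperature range): overlap / union length."""
    if a is None or b is None:
        return 0.0
    overlap = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = max(a[1], b[1]) - min(a[0], b[0])
    return overlap / union if union > 0 else 1.0

def collective_sim(a, b):
    """'Collective type' (e.g., f5 resolution set): Jaccard index."""
    if not a or not b:
        return 0.0
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)
```

For instance, comparing the f5 sets of T01 ({704×576; 1280×720}) and T02 ({704×576; 1280×960; 1280×720; 704×480}) under this sketch gives a Jaccard similarity of 2/4 = 0.5.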
From Eqs (5) and (6), we calculate the weight of each feature separately; the results are w_1 = 0.09089, w_2 = 0.11481, w_3 = 0.11294, w_4 = 0.09217, w_5 = 0.11061, w_6 = 0.10381, w_7 = 0.11596, w_8 = 0.09915, w_9 = 0.07682, and w_{10} = 0.08285. These feature weights are used in Sections 4.2 and 4.3.
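Eqs (5) and (6) themselves are outside this excerpt, so the sketch below shows one common entropy-weight construction consistent with the numbers above (the weights sum to 1, and the most diverse feature, the f7 lens set, gets the largest weight): compute the Shannon entropy of each feature's value distribution over the device set, then normalize. The paper's actual formulas may differ in detail.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of one feature's value distribution."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_weights(feature_columns):
    """Normalize per-feature entropies so the weights sum to 1:
    features that vary more across devices receive larger weights."""
    ents = [entropy(col) for col in feature_columns]
    total = sum(ents)
    return [e / total for e in ents]
```

With this construction, a feature that takes a distinct value on every device (maximum entropy) is weighted above one that is nearly constant, matching the intuition that w_7 > w_9 in the results above.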
To verify the effectiveness of identifying device types from the inherent features, we carry out a device identification experiment with complete feature sets and calculate the similarity between every pair of devices.
We use the feature weights calculated in Section 4.1, set the error tolerance \varepsilon = 1 for the "numerical value" features, and calculate the similarity between every pair of devices in {T01, T02, ..., T40} using Eq (7). Figure 2 shows the confusion matrix of the device similarities.
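Assuming Eq (7) combines the ten per-feature FSCS similarities as a weighted sum (a plausible reading of the weighting scheme in Section 4.1, not a verbatim reproduction of the equation), the overall device similarity can be sketched as:

```python
# Feature weights from Section 4.1 (w_1 through w_10).
WEIGHTS = [0.09089, 0.11481, 0.11294, 0.09217, 0.11061,
           0.10381, 0.11596, 0.09915, 0.07682, 0.08285]

def device_similarity(feat_sims, weights=WEIGHTS):
    """feat_sims[i] in [0, 1] is the FSCS similarity of feature f(i+1)
    between the base and target devices; a missing feature contributes 0."""
    return sum(w * s for w, s in zip(weights, feat_sims))
```

Under this sketch, identical devices score (approximately) 1 because the weights sum to 1, and a device sharing no features with another scores 0, consistent with the 0-to-0.842 range seen in Figure 2.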
In Figure 2, "True Type" denotes the actual fine-grained type of the device, and "Basic Type" denotes the basic type. When "True Type" is "T02" and "Basic Type" is "T01", the value is 0.492, which means that the similarity between T02 and T01 is 0.492; in other words, Similarity\left(F_{base, 1}, F_{target, 2}\right) = 0.492.
Figure 2 shows that, using the 10 inherent features described in Section 4.1, the similarity between any two devices is less than 0.85 (the maximum value is 0.842), which effectively distinguishes the devices; in other words, it demonstrates the effectiveness of the inherent features for distinguishing device types. The regions of higher similarity are T01-T15, T16-T20, T21-T25, and T36-T40, which indicates that the inherent features of devices from the same manufacturer are more similar than those of devices from different manufacturers.
This experiment simulates the incomplete device features encountered in real environments: we discard individual features, or elements of collective features, with a certain probability, and then perform fine-grained device identification with the incomplete features.
The discard probability of an inherent feature increases from 0 to 1 in increments of 0.01. At each discard probability value, a target set with missing inherent features is generated from the knowledge set, the similarity between each target-set device and every knowledge-set device type is calculated, and the most similar device type is taken as the target device's type. To account for the randomness of discarding, the experiment is repeated 1000 times at each probability value to obtain the average identification accuracy for each device type.
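The Monte Carlo procedure above can be sketched as follows. This is a simplified model, not the paper's code: each device is reduced to a tuple of ten feature values, per-feature similarity is idealized to exact equality, and a discarded feature (set to `None`) contributes 0, as described in the experiment.

```python
import random

# Feature weights from Section 4.1.
WEIGHTS = [0.09089, 0.11481, 0.11294, 0.09217, 0.11061,
           0.10381, 0.11596, 0.09915, 0.07682, 0.08285]

def similarity(target, candidate):
    # A missing (None) target feature contributes 0 similarity.
    return sum(w for w, t, c in zip(WEIGHTS, target, candidate)
               if t is not None and t == c)

def simulate_accuracy(knowledge, p, trials=1000, threshold=0.0, seed=0):
    """Estimate average identification accuracy when each inherent feature
    of the target is discarded independently with probability p.
    `knowledge` maps type name -> list of 10 feature values."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        true_type = rng.choice(list(knowledge))
        target = [None if rng.random() < p else v
                  for v in knowledge[true_type]]
        scores = {t: similarity(target, f) for t, f in knowledge.items()}
        best = max(scores, key=scores.get)
        if scores[best] >= threshold and best == true_type:
            correct += 1
    return correct / trials
```

Setting `threshold=0.5` reproduces the failure mode discussed next: at high discard probabilities the maximum similarity falls below the threshold even when the argmax is the correct type, whereas `threshold=0.0` counts those cases as successes.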
As the feature discard probability increases, the similarity values between fine-grained device types decrease. This is because when a feature value of the target device is empty (i.e., the feature was not acquired), the corresponding feature similarity between the target device and a knowledge-set device is 0; the fewer non-empty feature values the target device has, the smaller its similarity to each knowledge-set device type. We therefore set a similarity threshold in the experiment: when the maximum similarity is less than the threshold, device identification is considered to fail (even if the type corresponding to the maximum similarity is the correct type). With the threshold set to 0.5, Figure 3 plots the average identification accuracy of each device type against the discard probability.
In Figure 3, when the discard probability is less than 0.25, the average identification accuracy for the fine-grained device types is greater than 90%. Beyond a discard probability of about 0.2, the average identification accuracy decreases rapidly as the discard probability increases: it is approximately 60% at a discard probability of 0.4 and close to 0 at a discard probability of 0.8.
An increase in the discard probability inevitably lowers the similarity values: while the similarity for the correct device type decreases, the similarities between the target device and the other device types also decrease. Consequently, the similarity-threshold strategy directly causes fine-grained identification to fail when the discard probability is large. Figure 4 shows the similarity matrix between the target-set devices and the knowledge-set devices in the 500th trial with a discard probability of 0.5.
Figure 4 shows that when the discard probability of the inherent features is 0.5, the maximum similarity values of some devices (e.g., T10 and T19) are less than 0.5, even though the device type with the maximum similarity is the correct one. Therefore, when features are missing, a similarity threshold should not be set (or the threshold should be set to 0).
Figure 5 shows that without a similarity threshold, the average identification accuracy of each fine-grained device type remains greater than 80% even when the discard rate of the inherent features is 50%. When the discard rate is 0.8, the average identification accuracy of each device is still greater than 35%. This shows that the method proposed in this paper can still identify fine-grained device types effectively when some inherent features are missing.
In this section, experimental results on real data show that the proposed method of identifying fine-grained device types from inherent features is feasible and effective. In practice, the more inherent features of the target device are obtained, the more accurate the identification will be; even when the "missing rate" of the inherent features is 50%, the average accuracy of fine-grained type identification still exceeds 80%.
This paper develops an inherent-feature-based device identification method to address the current deficiency in fine-grained identification of camera device types. The method classifies the devices' inherent features into four types according to their form of expression, establishes a feature similarity calculation strategy (FSCS) for each type of inherent feature, assigns each inherent feature a weight derived from information entropy, and combines the FSCS and the weights into a fine-grained device type identification model. Experimental results show that the proposed method can recognize fine-grained device types even when some inherent features are absent; in particular, when the "absence rate" of the inherent features is 50%, the average accuracy still exceeds 80%. In future work, finding more effective inherent features and combining them with network features may be an important direction for improving the accuracy of fine-grained device type identification.
We would like to thank the reviewers for their constructive suggestions, which helped to improve the paper considerably.
The authors declare that there is no conflict of interest.
| Camera device model | Appearance | Size (mm) | Weight (g) | Imaging component (inches) | Compression rate (bps) |
|---|---|---|---|---|---|
| Dahua DH-IPC-HDBW2230R-AS | Dome | φ122 × 89 | 406 | CMOS, 1/2.7 | 6K–8M |
| Dahua DH-IPC-HDBW4636R-AS | Dome | φ122 × 89 | 444 | CMOS, 1/2.9 | 73K–10M |
| Dahua DH-IPC-HDBW4833R-ZAS | Dome | φ122 × 88.9 | 494 | CMOS, 1/1.8 | 16K–14.75M |
| Dahua DH-IPC-HFW4433F-ZAS | Bullet | 186 × 87 × 85 | 772 | CMOS, 1/3.0 | 8K–10M |
| Hikvision DS-2CD1311D | Dome | φ127 × 97.5 | 570 | CMOS, 1/3.0 | 32K–16M |
| Hikvision DS-2CD3345P1 | Dome | φ127.3 × 103.7 | 340 | CMOS, 1/3.0 | 32K–16M |
| Hikvision DS-2CD3646F | Barrel | 191.4 × 97.9 × 93.5 | 1260 | CMOS, 1/2.7 | 32K–8M |
| Hikvision DS-2CD3726F | Dome | 153.3 × 153.3 × 111.6 | 840 | CMOS, 1/2.7 | 32K–8M |
| No. | Device model | No. | Device model |
|---|---|---|---|
| T01 | Dahua DH-IPC-HDBW1020R | T21 | Hikvision DC-2CD2T45 |
| T02 | Dahua DH-IPC-HDBW2130R-AS | T22 | Hikvision DS-2CD1225 |
| T03 | Dahua DH-IPC-HDBW2230R-AS | T23 | Hikvision DS-2CD1311D |
| T04 | Dahua DH-IPC-HDBW4636R-AS | T24 | Hikvision DS-2CD3310F-I |
| T05 | Dahua DH-IPC-HDBW4833R-ZAS | T25 | Hikvision DS-2CD3345P1 |
| T06 | Dahua DH-IPC-HDPW4233-PT-SA-0360B | T26 | Hikvision DS-2CD3410FD-IW |
| T07 | Dahua DH-IPC-HFW4433F-ZAS | T27 | Hikvision DS-2CD3646F |
| T08 | Dahua DH-IPC-HFW4631K-AS | T28 | Hikvision DS-2CD3726F |
| T09 | Dahua DH-IPC-HFW8841K-ZRL-DS | T29 | Hikvision DS-2CD3935FWD-IWS |
| T10 | Dahua DH-IPC-PDBW4638-B270 | T30 | Hikvision DS-2CD3942F-I |
| T11 | Dahua DH-CA-DW48 | T31 | Hikvision DS-2CD3A20F-IS |
| T12 | Dahua DH-CA-FW19M-IR8 | T32 | Hikvision DS-2CD3T25-I5 |
| T13 | Dahua DH-HAC-HDW2208E | T33 | Hikvision DS-2CD3T56WD-I5 |
| T14 | Dahua DH-IPC-EW4431-ASW | T34 | Hikvision DS-IPC-E22H-IW |
| T15 | Dahua DH-IPC-HDBW4243E-ZFD | T35 | Hikvision DS-IPC-T12-I |
| T16 | ACTi ACM-3001 | T36 | TP-LINK TL-IPC223 |
| T17 | ACTi ACM-5001 | T37 | TP-LINK TL-IPC313K-W10 |
| T18 | ACTi ACM-3511 | T38 | TP-LINK TL-IPC43KZ |
| T19 | ACTi ACM-5601 | T39 | TP-LINK TL-IPC546H |
| T20 | ACTi ACM-7411 | T40 | TP-LINK TL-IPC646P-A4 |
| No. | f1 | f2 (mm) | f3 (g) | f4 (inch) | f5 | f6 (Lux) | f7 | f8 (bps) | f9 (s) | f10 (℃) |
|---|---|---|---|---|---|---|---|---|---|---|
| T01 | Dome | 122×89 | 400 | CMOS, 1/4.0 | {704×576;1280×720} | 0.4;0.22; | {6, F2.6, 37;3.6, F2.5, 59;2.8, F2.5, 70.5} | 5K–5M | 1/3–1/100000 | -30 to 60 |
| T02 | Dome | 122×89 | 462 | CMOS, 1/3.0 | {704×576;1280×960;1280×720;704×480} | 0.01;0.001;0 | {8, F2.5, 35.5;6, F2.1, 47;3.6, F2.0, 72.2;2.8, F2.0, 91.7} | 12K–6M | 1/3–1/100000 | -40 to 60 |
| T03 | Dome | 122×89 | 406 | CMOS, 1/2.7 | {704×576;1920×1080;704×480} | 0.01;0.001;0 | {8, F2.2, 42;6, F2.0, 55;3.6, F2.0, 90;2.8, F2.0, 115} | 6K–8M | 1/3–1/100000 | -40 to 60 |
| T04 | Dome | 122×89 | 444 | CMOS, 1/2.9 | {2688×1520;704×576;2592×1944;1280×720;3072×2048;704×480;2560×1440} | 0.002;0.0002;0 | {6, F2.5, 47.34;3.6, F2.2, 70;2.8, F2.0, 99} | 73K–10M | 1/3–1/100000 | -30 to 60 |
| T05 | Dome | 122×88.9 | 494 | CMOS, 1/1.8 | {704×576;1920×1080;3840×2160;704×480} | 0.002;0.0002;0 | {12, F2.8, 45.5;3.5, F1.9, 110} | 16K–14.75M | 1/3–1/100000 | -40 to 55 |
| T06 | Dome | 124.6×82.3 | 265 | CMOS, 1/2.8 | {704×576;1920×1080;1920×10801;704×480} | 0.002;0.0002;0 | {3.6, F1.6, 87} | 12K–10M | 1/3–1/100000 | -20 to 50 |
| T07 | Bullet | 186×87×85 | 772 | CMOS, 1/3.0 | {704×576;1280×720;704×480;2592×1520;2560×1440} | 0.001;0.0001;0 | {13.5, F3.13, 27.7;2.7, F1.6, 104.2} | 8K–10M | 1/3–1/100000 | -30 to 60 |
| T08 | Bullet | 194×96×89 | 640 | CMOS, 1/2.9 | {2688×1520;704×576;2592×1944;1280×720;3072×2048;704×480;2560×1440} | 0.002;0.0002;0 | {6, F2.5, 47.34;3.6, F2.2, 70;2.8, F2.0, 99} | 73K–10M | 1/3–1/100000 | -40 to 60 |
| T09 | Bullet | 221×109×101 | 1080 | CMOS, 1/1.8 | {704×576;1920×1080;3840×2160;1704×576;704×480} | 0.01;0.001;0 | {2.7, F1.2, 111} | 32K–8M | 1/3–1/100000 | -30 to 60 |
| T10 | Dome | 285.1×100.8 | 2600 | CMOS, 1/2.8 | {704×576;1920×1080;1280×720;1280×960;704×480} | 0.002;0.0002;0 | {12, F2.7, 44;2.7, F1.8, 105} | 16K–8M | 1/3–1/100000 | -30 to 60 |
| T11 | Dome | 113.6×85.4 | 300 | CCD, 1/1.3 | {976×582} | 0.001;;0 | {2.8, F1.2, 0} | 16K–8M | 1/50–1/100000 | -30 to 60 |
| T12 | Bullet | 194.4×96.6×89.5 | 400 | CMOS, 1/3.0 | {1280×720} | 0.01;; | {3.6, F1.2, 0} | 16K–8M | 1/25–1/100000 | -30 to 60 |
| T13 | Dome | 110×95 | 440 | CMOS, 1/2.8 | {1920×1080;1280×720} | 0.001;; | {3.6, F1.2, 0} | 16K–8M | 1/1–1/30000 | -30 to 60 |
| T14 | Dome | 126×37.7 | 500 | CMOS, 1/3.0 | {2688×1520;704×576;1920×1080;1280×720} | 0.001;; | {1.6, F1.4, 180} | 14K–40M | 1/3–1/100000 | -10 to 50 |
| T15 | Dome | 159.1×117.9 | 925 | CMOS, 1/2.8 | {704×576;1920×1080;704×480} | 0.002;0.0002;0 | {35, F1.8, 13;13.5, F3.05, 31;7, F1.4, 36;2.7, F1.6, 110} | 16K–8M | 1/3–1/100000 | -40 to 60 |
| T16 | Dome | 130×99 | 350 | CMOS, 1/3.0 | {320×240;160×120;640×480} | ; ; | {4.2, F1.8, 0} | | 1/5–1/2000 | 0 to 40 |
| T17 | Bullet | 67×55 | 400 | CMOS, 1/3.0 | {320×240;160×120;640×480} | ; ; | {4.2, F1.8, 0} | | 1/5–1/2000 | 0 to 50 |
| T18 | Dome | 130×99 | 380 | CMOS, 1/3.0 | {320×240;160×112;640×480;1280×1024;1280×720} | ; ; | {3.3, F1.6, 0} | | 1/5–1/2000 | -10 to 45 |
| T19 | Bullet | 67×55×129.5 | 400 | CMOS, 1/3.0 | {640×480;1280×720;1280×1024} | ; ; | {4.2, F1.8, 0} | | 1/5–1/2000 | 0 to 50 |
| T20 | Dome | 151.69×114.9 | 1040 | CMOS, 1/3.0 | {640×480;1280×720;1280×1024} | ; ; | {3.3, F1.4, 0} | | 1/10–1/2000 | -30 to 50 |
| T21 | Barrel | 90×85×169 | 550 | CMOS, 1/3.0 | {2560×1440} | 0.1;; | {1.68, F2.0, 180} | 32K–16M | 1/3–1/100000 | -30 to 60 |
| T22 | Bullet | 100.5×88.1×157.3 | 600 | CMOS, 1/4.0 | {1280×720} | 0.01;; | {12, F1.2, 26;8, F1.2, 39;6, F1.2, 50;4, F1.2, 76;2.8, F1.2, 95} | 32K–8M | 1/25–1/100000 | -30 to 60 |
| T23 | Dome | 127×97.5 | 570 | CMOS, 1/3.0 | {1280×960} | 0.01;; | {16, F1.2, 18;12, F1.2, 22;8, F1.2, 33;6, F1.2, 50;4, F1.2, 69;2.8, F1.2, 90} | 32K–16M | 1/25–1/100000 | -30 to 60 |
| T24 | Dome | 114.6×89.4 | 670 | CMOS, 1/3.0 | {1280×960} | 0.01;; | {4, F2.0, 75.8} | 32K–8M | 1/3–1/100000 | -30 to 60 |
| T25 | Dome | 127.3×103.7 | 340 | CMOS, 1/3.0 | {2560×1440} | 0.01;; | {4, F1.2, 69;2.8, F1.2, 90} | 32K–16M | 1/3–1/100000 | -10 to 40 |
| T26 | Bullet | 66×139.1×70.6 | 400 | CCD, 1/3.0 | {1280×920} | 0.02;; | {4, F2.0, 75.8} | 32K–16M | 1/25–1/100000 | -25 to 60 |
| T27 | Barrel | 191.4×97.9×93.5 | 1260 | CMOS, 1/2.7 | {2560×1440} | 0.005;; | {2.8, F1.2, 105} | 32K–8M | 1/3–1/100000 | -30 to 60 |
| T28 | Barrel | 153.3×153.3×111.6 | 840 | CMOS, 1/2.7 | {1920×1080} | 0.002;; | {2.7, F1.2, 103} | 32K–8M | 1/3–1/100000 | -30 to 60 |
| T29 | Dome | 119.9×41.2 | 600 | CMOS, 1/2.8 | {1600×1200;1280×960;2048×1536} | 0.005;;0 | {1.16, F2.2, 180} | 32K–16M | 1/3–1/100000 | -10 to 40 |
| T30 | Dome | 119.9×41.2 | 600 | CMOS, 1/3.0 | {2048×1536} | 0.01;;0 | {1.6, F1.6, 186} | 32K–8M | 1/25–1/100000 | -10 to 40 |
| T31 | Barrel | 134×116.1×293 | 2000 | CMOS, 1/2.8 | {1920×1080} | 0.01;;0 | {25, F1.2, 12.4;16, F1.2, 19.2;12, F1.2, 24.6;8, F1.2, 40;6, F1.2, 52;4, F1.2, 85} | 32K–8M | 1/3–1/100000 | -30 to 60 |
| T32 | Bullet | 194.04×93.85×89.52 | 1000 | CMOS, 1/2.7 | {1920×1080;1280×960;1296×732;1280×720} | 0.01;; | {4, F1.2, 80} | 32K–8M | 1/3–1/100000 | -30 to 60 |
| T33 | Bullet | 93.85×93.52×194.1 | 690 | CMOS, 1/2.7 | {704×576;2560×1920;640×480;1280×720;2560×1536;1920×1280;352×288} | 0.005;; | {6, F1.2, 55;4, F1.2, 83} | 32K–8M | 1/3–1/100000 | -30 to 60 |
| T34 | Barrel | 175×89×75 | 340 | CMOS, 1/2.7 | {1920×1080} | 0.01;; | {8, F1.2, 43;6, F1.2, 54.4;4, F1.2, 89.1;2.8, F1.2, 114.8} | 32K–8M | 1/3–1/100000 | -30 to 50 |
| T35 | Dome | 110×93.2 | 350 | CMOS, 1/2.7 | {1920×1080} | 0.01;; | {8, F1.2, 43;6, F1.2, 54.5;4, F1.2, 91;2.8, F1.2, 114.8} | 32K–8M | 1/3–1/100000 | -30 to 60 |
| T36 | Dome | 113×113×87 | 268 | CMOS, 1/2.7 | {1920×1080} | 0.1;0.1;0 | {4, F2.1, 0;6, F2.1, 0} | 64K–8M | 1/25–1/100000 | -30 to 60 |
| T37 | Barrel | 173×83.4×84.2 | 390 | CMOS, 1/3.0 | {1280×960} | 0.1;0.1;0 | {2.8, F2.1, 0;4, F2.1, 0;6, F2.1, 0} | 64K–4M | 1/25–1/100000 | -10 to 60 |
| T38 | Dome | 120×120×129 | 328 | CMOS, 1/2.7 | {2304×1296} | 0.1;;0 | {3.35, F2.4, 0} | 64K–2M | 1/25–1/100000 | -30 to 50 |
| T39 | Barrel | 173×83.4×84.2 | 390 | CMOS, 1/2.7 | {2560×1440} | 0.01;; | {4, F1.6, 0;6, F1.6, 0;8, F1.6, 0;12, F1.6, 0} | 256K–6M | 1/25–1/100000 | -30 to 60 |
| T40 | Ball | 230×177×188 | 930 | CMOS, 1/3.0 | {2560×1440} | 0.002;;0 | {4, F1.6, 0} | 64K–6M | 1/25–1/100000 | -30 to 60 |
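The f1–f10 attribute schema tabulated above amounts to a per-device record. A minimal sketch of how one row could be held as structured data, assuming the field meanings implied by the two tables (appearance, size, weight, imaging component, resolutions, minimum illumination, lens options, compression rate, shutter range, operating temperature); the `CameraSpec` class name and field names are our own, and the example values are copied from row T03:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class CameraSpec:
    """One row of the f1-f10 device-attribute table (field names are illustrative)."""
    no: str                                  # device index, e.g. "T03"
    appearance: str                          # f1: Dome / Bullet / Barrel / Ball
    size_mm: Tuple[float, ...]               # f2: housing dimensions (mm)
    weight_g: float                          # f3: weight (g)
    sensor: str                              # f4: imaging component, e.g. "CMOS, 1/2.7"
    resolutions: List[Tuple[int, int]]       # f5: supported frame sizes
    min_illumination_lux: List[float]        # f6: minimum illumination values
    lenses: List[Tuple[float, str, float]]   # f7: (focal length mm, aperture, FoV deg)
    bitrate: Optional[Tuple[str, str]]       # f8: compression-rate range (None if unlisted)
    shutter_s: str                           # f9: shutter-speed range (s)
    temp_c: Tuple[int, int]                  # f10: operating-temperature range (C)

# Example: row T03 (Dahua DH-IPC-HDBW2230R-AS), values taken from the table
t03 = CameraSpec(
    no="T03", appearance="Dome", size_mm=(122, 89), weight_g=406,
    sensor="CMOS, 1/2.7",
    resolutions=[(704, 576), (1920, 1080), (704, 480)],
    min_illumination_lux=[0.01, 0.001, 0],
    lenses=[(8, "F2.2", 42), (6, "F2.0", 55), (3.6, "F2.0", 90), (2.8, "F2.0", 115)],
    bitrate=("6K", "8M"), shutter_s="1/3-1/100000", temp_c=(-40, 60),
)
print(t03.appearance, len(t03.lenses))  # prints: Dome 4
```

Keeping the multi-valued fields (f5–f7) as lists rather than delimited strings makes per-attribute comparisons across the 40 devices straightforward.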