PATRONAGE

Ministry of Science and Higher Education

Ministry of Economic Development

Polish Academy of Sciences
Committee of Metrology and Measurement Systems

Polish Society for Measurement,
Automatic Control and Robotics

Leszek Dorula 

The Mayor of Zakopane

Polish Centre for Accreditation

 

INVITED SPEAKERS

Plenary I - THE DEFINITION OF THE KELVIN IN THE NEW SI: ITS RATIONALE, IMPLEMENTATION AND IMPLICATIONS

 Michael de Podesta
 National Physical Laboratory, United Kingdom
 

Michael de Podesta received a B.Sc. (Hons) in Physics from Sussex University in 1981 and a D.Phil. for helicon wave studies in potassium in 1985. After postdoctoral work at Bristol University, he was appointed a lecturer at Birkbeck College, London, in 1987 and moved to University College London in 1997. In 2000 he took up his current position at NPL, specialising in temperature measurement. In 2009 he was awarded an M.B.E. for Services to Science.
He sits on the BIPM Working Group for Contact Thermometry and on Task Groups for the Environment and the Kelvin redefinition. He is a member of the World Meteorological Organization Commission for Instruments and Methods of Observation Expert Groups A2 and C1, and sits on the Steering Committee of the International Surface Temperature Initiative. In 2013 his team published a low-uncertainty measurement of the Boltzmann constant, and in 2016 they published low-uncertainty estimates of T – T90.

ABSTRACT:

The forthcoming re-definition of four SI base units – the kilogram, kelvin, ampere and mole – represents an evolution in our conception of what we mean by a measurement unit. The rationale for the re-definition of each unit is different, but together they make the SI more coherent than it has been since its inception in 1960. For the kelvin, the redefinition achieves two desirable outcomes: the abandonment of a choice of ‘special temperature and material’, and the acknowledgement – finally – of the fundamental connection between temperature and energy.
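In outline, the redefinition makes this connection explicit by assigning an exact numerical value to the Boltzmann constant k_B, so that a statement about temperature becomes, in effect, a statement about thermal energy. Schematically (the numerical value is shown only for illustration and is not the formal wording of the definition):

    \[
      E = k_B T , \qquad k_B \approx 1.3806 \times 10^{-23}\ \mathrm{J\,K^{-1}} .
    \]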

However, it is important that these advantages are not wasted in poor implementation. In this regard, we are fortunate that the kelvin is almost universally disseminated through adherence to the International Temperature Scale of 1990 (ITS-90), and there is currently no plan to revise this scale. So if we can avoid confusion about the re-definition, there should be precisely no practical implications – in the short term.

In the longer term, we can expect to notice the impact of the last decade of research aimed at determining the Boltzmann constant. Extensions of this work will yield improved estimates of the differences between ITS-90 temperature (T90) and thermodynamic temperature (T), and also reduced-cost apparatus capable of realising T directly, based on measurements of a range of physical properties. In such a future, the task facing a hypothetical International Temperature Scale 20XX becomes daunting. At its initiation, TXX will be close to our best estimates of T, and in replacing the well-established ITS-90, the differences between T90 and TXX will need to be communicated. However, some laboratories may by then be independently realising T directly, and great care will be needed to communicate the significance of these different estimates of temperature.

Plenary II - THE STATUS AND FUTURE OF JOHNSON NOISE THERMOMETRY

 Jifeng Qu
 National Institute of Metrology, China

Jifeng Qu was born in Xi’an, China, on December 16, 1978. He received the B.S. degree in materials physics and the Ph.D. degree in condensed matter physics from the University of Science and Technology of China, Hefei, China, in 2001 and 2006, respectively.
During April 2007–October 2009, he was a Guest Researcher on the Johnson Noise Thermometry Program with the National Institute of Standards and Technology, Boulder, CO, investigating electronic nonlinearity using superconducting quantum-based voltage sources.
In November 2009, he joined the National Institute of Metrology, China, where he works on Johnson noise thermometry and quantum voltage standards. He is currently focusing on the measurement of the Boltzmann constant with quantum-voltage-calibrated Johnson noise thermometry.

ABSTRACT:
Johnson noise thermometers infer the thermodynamic temperature by measuring the thermally induced electronic fluctuations that occur in all electrical conductors. As a purely electronic approach, Johnson noise thermometry offers an appealing alternative to other forms of primary thermometry and has attracted increasing interest. However, because the noise signal is extremely small, random, and distributed over very wide bandwidths, a number of technological breakthroughs have been required to enable measurements with accuracies comparable to those of the other primary methods. Over the last few years, Johnson noise thermometry has revealed its potential by contributing to recent measurements of the Boltzmann constant with relative uncertainties of 0.0004 %. It is also useful at high temperatures, with relative uncertainties of 0.004 % demonstrated up to 800 K, and could be competitive with acoustic and radiation thermometry in the difficult temperature range of 600 K to 1000 K.
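For orientation, these fluctuations obey the Johnson–Nyquist relation; the parameter values below are chosen purely to illustrate the scale of the signal and are not taken from the talk:

    \[
      \langle V^2 \rangle = 4 k_B T R \, \Delta f , \qquad
      V_{\mathrm{rms}} \approx 1.3\ \mu\mathrm{V}
      \quad \text{for } R = 100\ \Omega,\ T = 300\ \mathrm{K},\ \Delta f = 1\ \mathrm{MHz},
    \]

which makes concrete why wide bandwidths, long integration times and very careful electronics are required.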

This presentation will review the current status and prospects of Johnson noise thermometry, for both metrological measurement and practical implementation in industry. We will begin with the foundations of Johnson noise thermometry and the key breakthroughs leading to the modern metrological noise thermometers: the cross correlator, fast analogue-to-digital converters and frequency-domain processing, and especially the quantum-accurate pseudo-random noise source developed by NIST. We will then review the current and emerging metrological applications, including the recent Boltzmann constant determination and the implications of the new kelvin definition. Finally, we will consider future prospects, including the possibilities for increased adoption of noise thermometry in industry.
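A minimal simulation sketch of the cross-correlation idea described above (written for this summary, not any laboratory's instrument; the resistance, temperature, bandwidth and amplifier-noise levels are all assumed):

    import numpy as np

    # Two channels digitise the same resistor; each adds its own
    # independent amplifier noise. Averaging the product of the two
    # channels suppresses the uncorrelated amplifier noise, leaving the
    # common thermal power, from which T follows via <V^2> = 4*k_B*T*R*df.
    rng = np.random.default_rng(0)
    k_B = 1.380649e-23                       # Boltzmann constant, J/K
    R, T_true, bw = 100.0, 300.0, 1.0e6      # ohm, K, Hz (assumed values)
    v2 = 4 * k_B * T_true * R * bw           # mean-square thermal voltage
    n_records, n_samples = 5000, 1024

    est = 0.0
    for _ in range(n_records):
        thermal = rng.normal(0.0, np.sqrt(v2), n_samples)             # common
        ch1 = thermal + rng.normal(0.0, 5 * np.sqrt(v2), n_samples)   # amp 1
        ch2 = thermal + rng.normal(0.0, 5 * np.sqrt(v2), n_samples)   # amp 2
        est += np.mean(ch1 * ch2)            # cross product rejects amp noise
    est /= n_records

    print(f"estimated T = {est / (4 * k_B * R * bw):.1f} K (true {T_true} K)")

The statistical uncertainty of the estimate falls as more records are averaged, which is why practical noise thermometers integrate for long periods.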

Plenary III - ENSURING THE DEVELOPMENT OF HIGH-QUALITY AND TRACEABLE CLIMATE TIME-SERIES IN SUPPORT OF MORE ROBUST CLIMATE CHANGE STUDIES


Manola Brunet
Director of the Centre for Climate Change, University Rovira i Virgili, Tarragona, Spain, Co-chair of WMO/Commission for Climatology OPACE2 on Climate Monitoring and Analysis

Dr Manola Brunet has been a Full Professor and Reader in Climatology at the University Rovira i Virgili (URV, Tarragona, Spain) since 1985 and Director of the Centre for Climate Change (C3) at URV (Tortosa, Spain) since 2009. She is also a Visiting Fellow at the Climatic Research Unit, School of Environmental Sciences, University of East Anglia, Norwich, UK.
Since 2005 she has been involved in international activities of the World Meteorological Organization (WMO) Commission for Climatology (CCl). She is currently co-chair of the WMO/CCl OPACE2 on Climate Monitoring and Analysis, a member of the CCl Management Group, and coordinator of the MEditerranean DAta REscue (MEDARE) Initiative. She is also involved in or contributing to a number of other international activities and bodies (e.g. GCOS, AOPC, ISTI) and has taken part in or organised many regional capacity-development activities in her areas of expertise.
Her expertise is focused on instrumental climate reconstruction and analysis, including data rescue and the development of high-quality, traceable climate datasets. She also contributes to the assessment of spatial and temporal climate variability (in both the mean and extreme states of the climate, at local, regional and global scales) and to climate change detection studies.

ABSTRACT:

Reliable and robust assessments of climate variability and climate change require the best climate data series possible, while ensuring that the data series are traceable to international standards. However, meteorological observations and their collections over time (namely, climate time series) are far from being of proven quality and homogeneity. These statistical properties have to be guaranteed before the data are used in any scientific study or climatic application. The use of climate data series that do not fulfil these criteria in any climate or climate change assessment could cast doubt on the reliability and robustness of those scientific analyses.
Therefore, this presentation will provide an overview of the most common issues affecting climate data quality and time-series homogeneity from a climatological perspective, emphasising the need to generate climatic records reasonably free of both non-systematic and systematic biases. It will also address the need to estimate the uncertainty budget associated with the adjustments applied to climate time series, in order to add these uncertainties to the combined uncertainty budget and so ensure a more complete traceability of climate data series, as was done in the framework of the MeteoMet/REG5 project. Air temperature records will be the focus of this presentation; their main biases will be discussed, along with strategies recently developed by the climatological community to minimise them. In addition, the impact of using low-quality climate data on the estimation of temperature trends will be discussed. Finally, recent attempts to set up climatological reference stations will be presented, emphasising the need to strengthen scientific cooperation between the metrological and climatological communities.

 

Plenary IV - THERMAL MEASUREMENT CHALLENGES IN ADVANCED MANUFACTURING 

 Gregory F. Strouse
 National Institute of Standards and Technology, USA

Gregory F. Strouse is the Associate Director for Measurement Services of the Physical Measurement Laboratory (PML) at the National Institute of Standards and Technology (NIST), and is a member of the board responsible for assessments of the NIST Quality System. Since joining NIST in 1988, he has become a leading expert in temperature measurement and in the realization and dissemination of the International Temperature Scale of 1990 (ITS-90). He has designed and built several new world-class facilities, including laboratories for the calibration of standard platinum resistance thermometers, thermocouples and industrial thermometers, and he is a NVLAP technical and lead assessor. His current research interests include NIST-on-a-Chip embedded sensors, cold-chain management for vaccines, dynamic pressure sensors and standards, Johnson noise thermometry, acoustic gas thermometry, realization of the Boltzmann constant, photonic pressure standards and sensors, and the development of alternative thermometers.

ABSTRACT:

One goal for improving the dissemination of standards is to reduce the need for the routine exchange of artifacts between an NMI and those seeking measurement assurance, by developing sensors and instruments that are inherently more accurate and stable than those in common use today. Another is the development of innovative sensors that—through improvements in cost, size, speed, durability, and other factors—may be more effectively utilized within manufacturing plants and products. As examples, laser-based cutting, welding, and sintering in additive manufacturing processes would benefit from improved real-time monitoring of process temperature. Similarly, networks of small and precise sensors embedded within structures and composite materials could improve their performance and reliability. These sensors draw upon a range of technologies not previously exploited for these applications, such as nanofabrication, photonics, and atomic physics.

NIST is focusing its efforts on several critical projects, including the development of a photonic pressure, temperature, and length standard to determine the Boltzmann constant (a goal of < 10 ppm) and to realize the unit of the kelvin; a Johnson noise thermometer to determine the Boltzmann constant (< 4 ppm) and to serve as a thermodynamic thermometer (< 10 ppm); and a photonic thermometer (< 5 mK). Embedded dual-mode, quantum-effect devices may serve as both a standard and a sensor. These self-calibrating, miniaturized nanoscale sensors include an intrinsic SI-traceable standard together with the sensor in a multi-function platform, thus obviating the need for frequent calibrations. For example, two dual-mode devices in development are 1) a nanoscale opto-mechanical thermometer that, as a standard, relies on the phonon Boltzmann distribution from thermal "Brownian" motion (mechanical) and, as a sensor, relies on the change in the index of refraction (optical), with a noise floor of a few nK and a dynamic response time of a few µs; and 2) a chip-scale Johnson noise thermometer, the first user-friendly thermodynamic thermometer. We envision combining these devices with a variety of other sensors on a standard platform in a multiplexed network, creating a nanoscale, multi-function thermodynamic sensor with broad-ranging applications in advanced manufacturing.

Plenary V - ADVANCES AND PROSPECTS IN HIGH TEMPERATURE RADIOMETRY

 Boris Khlevnoy
All-Russian Research Institute for Optical and Physical Measurements (VNIIOFI), Russia

Boris Khlevnoy graduated from the Physics Department of Lomonosov Moscow State University in 1985 and was awarded his PhD degree in metrology in 2001. Since 1985 he has been working at the All-Russian Research Institute for Optical and Physical Measurements (VNIIOFI), where he is responsible for the national standards of spectral radiance and spectral irradiance. With his colleagues he developed a set of high-temperature blackbodies and furnaces, which are widely used for radiometric and temperature measurements at VNIIOFI and at some other NMIs worldwide. He has contributed to the development of methods for measuring the thermodynamic temperature of a blackbody and to the development and investigation of HTFPs, including the determination of their thermodynamic temperatures. He is a co-opted member of CCT-WG-NCT, represents Russia in the CCPR Working Group on Key Comparisons and leads a COOMET TC on photometry and radiometry.

ABSTRACT:

High-temperature radiometry covers the UV, visible and near-IR spectral ranges, and needs high-temperature blackbodies (HTBBs) at temperatures up to approximately 3000 K. HTBBs are relatively well developed, but determining their temperature was until recently the main obstacle to their use in radiometry. This situation has changed radically with the introduction of metal (carbide)-carbon, M(C)-C, high-temperature fixed points (HTFPs). The excellent repeatability of HTFPs makes them ideal reference standards that can significantly improve the establishment and maintenance of high-temperature scales and, therefore, of the radiometric and photometric quantities based on HTBBs. The demonstrated reproducibility of HTFPs gives a sound metrological basis for high-accuracy monitoring of long-term, (potentially) slowly varying radiometric processes. The recent assignment of thermodynamic temperatures to three HTFPs (Co-C, Pt-C and Re-C) has led to a reduction of uncertainties in radiometric and photometric measurements. HTFPs are already used routinely to improve the measurement of HTBB temperatures; the development of large-area HTFP cells will bring further advances.

Two relative radiometry methods of measuring HTBB thermodynamic temperature will be discussed. The first is based on a comparison of two standard radiation sources: a blackbody (BB) and synchrotron radiation (SR). The relative spectrum of SR in the visible and IR can be determined with high accuracy. By comparing the BB with SR, we can measure the ratio of the BB spectral components at any two wavelengths, and then determine its temperature. PTB and VNIIOFI have initiated a joint activity to realize this method, which will be continued within the InK-2 project. The second method, suggested by Wulfson almost 70 years ago, is based on measuring the ratios of the spectral radiances of two blackbodies at two different temperatures at two wavelengths. The uncertainty of this method decreases with increasing difference between the temperatures and between the wavelengths. The special feature of this method is that it needs no other standard sources or detectors: the Planck law is self-sufficient for determining thermodynamic temperatures and thus all other blackbody-based radiometric quantities. However, the method has not, until now, been implemented, because of the demanding specifications required for the measurements; for instance, it needs very stable and reproducible blackbodies. With the advent of HTFP blackbodies this method can be experimentally realized and compared with the more traditional filter radiometer approach, providing an independent verification of that method.
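A minimal numerical sketch of the Wulfson scheme (the wavelengths and temperatures below are assumptions made for this illustration, not values from the talk). In a radiance ratio taken at a single wavelength the Planck prefactor cancels, so the ratio depends only on the second radiation constant c2 and the two temperatures; two measured ratios at two wavelengths then suffice to solve for both temperatures without any external standard:

    import numpy as np
    from scipy.optimize import fsolve

    C2 = 1.438777e-2   # second radiation constant h*c/k_B, m*K

    def ratio(lam, t_hot, t_cold):
        """Planck radiance ratio L(lam, t_hot) / L(lam, t_cold)."""
        return np.expm1(C2 / (lam * t_cold)) / np.expm1(C2 / (lam * t_hot))

    # 'Measured' ratios, simulated here from assumed true temperatures:
    t_hot_true, t_cold_true = 2750.0, 2000.0   # K (assumed)
    lam1, lam2 = 450e-9, 900e-9                # m (assumed)
    r1 = ratio(lam1, t_hot_true, t_cold_true)
    r2 = ratio(lam2, t_hot_true, t_cold_true)

    def residuals(x):
        t_hot, t_cold = x
        return [ratio(lam1, t_hot, t_cold) - r1,
                ratio(lam2, t_hot, t_cold) - r2]

    t_hot, t_cold = fsolve(residuals, x0=[2500.0, 1800.0])
    print(f"recovered: {t_hot:.1f} K and {t_cold:.1f} K")

Consistent with the abstract's remark, the system becomes better conditioned as the separation between the two temperatures and between the two wavelengths grows.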

INVITED PAPER - ACCURATE EXPERIMENTAL DETERMINATION OF THE ISOTOPE EFFECTS ON THE TRIPLE POINT TEMPERATURE OF WATER

Harro Meijer
Centre for Isotope Research (CIO), Energy and Sustainability Research Institute Groningen

Harro (H.A.J.) Meijer holds a PhD in atomic and molecular physics from Utrecht University. After postdoctoral periods in Germany and Denmark he took up a position at the Centre for Isotope Research of the University of Groningen, becoming full professor and leader of that group in 1999. In 2009 he founded the Energy and Sustainability Research Institute Groningen (ESRIG), of which he is the director. Applications using the stable isotopes of water, both in natural systems and in biomedical research using isotope labelling, form one of his lines of research. The work on the triple point of water fits into this line.

ABSTRACT:

Variation in the isotopic composition of water is one of the major contributors to uncertainty in the realization of the triple point of water (TPW). Although the dependence of the TPW on the isotopic composition of the water has been known for years, a detailed and accurate experimental determination of the values of the correction constants was still lacking. In a collaborative study, our two groups addressed the quantification of isotope abundance effects on the triple point temperature of water, dealing with the dependence on 2H and on 17,18O separately. We manufactured two sets of five triple point cells with accurately prepared water mixtures. The first set of five had a range of 2H isotopic abundances widely encompassing the natural abundance range, while the 18O and 17O isotopic abundances were kept approximately constant and the 18O − 17O ratio was close to the Meijer–Li relationship for natural waters. The selected range of 2H isotopic abundances led to cells that realised TPW temperatures between approximately −140 μK and +2500 μK with respect to the TPW temperature as realized by VSMOW (Vienna Standard Mean Ocean Water).

For the O-side of the isotope dependence, we decided to combine the 18O and the (much smaller) 17O effects into a single coefficient, as the 17O–18O relation in natural water is very constant. In the second set of five triple point cells, therefore, the 18O and 17O abundances extended well beyond the natural abundance range, while maintaining the Meijer–Li relationship; furthermore, the 2H isotopic abundance was kept close to that of VSMOW. These cells realized triple point temperatures ranging between −220 μK and +1420 μK with respect to the TPW temperature of VSMOW.

Our studies lead to the recommendation of a new correction equation, ΔT = A2H · δ2H + AO · δ18O, with A2H = (673 ± 4) μK/(‰ δ2H) and AO = (630 ± 10) μK/(‰ δ18O) (k = 1, or 1-σ). Using our correction equation, the uncertainty in the isotope correction for triple point cell waters used around the world will be < 1 μK.
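A minimal sketch applying the recommended correction (coefficients as quoted above; the δ values in the example are hypothetical):

    # Isotope correction of TPW cell water relative to VSMOW, in microkelvin.
    A_2H = 673.0   # uK per permil delta-2H   (+/- 4, k = 1)
    A_O  = 630.0   # uK per permil delta-18O  (+/- 10, k = 1)

    def tpw_isotope_offset(delta_2h_permil, delta_18o_permil):
        """Temperature offset of the cell water with respect to VSMOW (uK)."""
        return A_2H * delta_2h_permil + A_O * delta_18o_permil

    # Hypothetical, slightly depleted cell water:
    print(tpw_isotope_offset(-0.5, -0.05))   # -368.0 uK below VSMOW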

Reference: V. Faghihi et al., Metrologia 2015, 52, 819 and Metrologia 2015, 52, 827.

© Tempmeko 2016