Global observing system




For further treatment of the data it is necessary to keep the results of the E-QC data quality control together with information on how suspect or erroneous data were treated (using a suitable system of flags). The output of the quality control system should include QC flags that indicate whether each measurement passed or failed, as well as a set of summary statements about the sensors.

Every effort has to be made at the Data Processing Centre to fill data gaps, correct all erroneous values and validate doubtful data detected by the QC procedures, choosing appropriate procedures in each case.
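A minimal sketch of keeping a value together with its QC outcome is shown below. The flag names, numeric values and record layout are illustrative only, not a prescribed WMO flag scheme:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

# Illustrative flag values; real systems define their own flag tables.
class QCFlag(Enum):
    PASSED = 0     # value passed all checks
    SUSPECT = 1    # value doubtful, kept pending validation
    CORRECTED = 2  # value replaced by a validated estimate
    MISSING = 3    # gap in the record

@dataclass
class QCRecord:
    """One measurement kept together with its QC outcome."""
    variable: str              # e.g. "air_temperature"
    value: Optional[float]     # the (possibly corrected) value
    original: Optional[float]  # the raw value as reported
    flag: QCFlag

def flag_value(variable, raw, passed, corrected=None):
    """Attach a QC flag to a raw value, keeping the original alongside."""
    if raw is None:
        return QCRecord(variable, None, None, QCFlag.MISSING)
    if passed:
        return QCRecord(variable, raw, raw, QCFlag.PASSED)
    if corrected is not None:
        return QCRecord(variable, corrected, raw, QCFlag.CORRECTED)
    return QCRecord(variable, raw, raw, QCFlag.SUSPECT)
```

Keeping the original value next to the corrected one is what allows later re-processing when a flagged value turns out to have been valid.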



CHAPTER IV QC MONITORING

As real-time quality control procedures have their limitations, some errors, such as sensor drift, bias and errors in data transmission, can go undetected. Performance monitoring at the network level is therefore required at meteorological data processing centres and by network managers.

Effective real time QC monitoring as an integral part of a QC system has to include checks of the following items:


  • Completeness of observations at the meteorological station;

  • Quality of data;

  • Completeness and timeliness of the collection of observational data at the centre concerned.

QC monitoring is intended to identify deficiencies and errors, monitor them and activate appropriate remedial procedures. Some assessments can and should be performed in real time, whereas others can only be accomplished after sufficient data have been gathered over a longer period.

QC monitoring requires the preparation of summaries and various statistics. It is therefore necessary to build up a QC Monitoring System which collects statistics on observational errors of individual meteorological variables, through a series of flags indicating the results of each check, and generates hourly, daily, weekly, monthly and yearly summaries of:



  • The total number of observations scheduled and available for each variable (completeness of data);

  • The total number of observations which failed the QC checks for each variable (quality of data) in case of:

      • Plausible value check,

      • Time consistency check,

      • Check on a maximum allowed variability of an instantaneous value,

      • Check on a minimum required variability of instantaneous values,

      • Internal consistency check;

  • The percentage of failed observations (quality of data);

  • The error and threshold values for each failed observation (reason for failure);

  • Root mean square (RMS) error / mean error / percentage failure for failed observations for each station (daily/weekly/monthly/yearly) (quality statistics).
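The summaries above can be sketched with a small aggregation function. The interface below (function name, tuple layout, dictionary keys) is illustrative; a real monitoring system would read these counts from its QC flag database:

```python
def qc_summary(scheduled, records):
    """Completeness and quality statistics for one variable and period.

    scheduled: number of observations expected in the period
    records:   list of (value, passed_qc) tuples actually received
    """
    received = len(records)
    n_failed = sum(1 for _, ok in records if not ok)
    return {
        "scheduled": scheduled,
        "received": received,
        # completeness of data: received vs. scheduled
        "completeness_pct": 100.0 * received / scheduled if scheduled else 0.0,
        # quality of data: share of received observations failing QC
        "failed": n_failed,
        "failed_pct": 100.0 * n_failed / received if received else 0.0,
    }
```

The same function can be run per hour, day, week, month or year simply by changing the set of records passed in.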

Stations with large percentages of failed observations are probably experiencing hardware or software failures or inappropriate maintenance. These should be referred back to the network manager.

The QC Monitoring System has to keep station monitoring statistics on the frequency and magnitude of observation errors encountered at each station. The statistics provide information for the purpose of:



  • Monitoring quality of station performance,

  • Locating persistent biases or failures in observations,

  • Evaluating improvement of quality of observation data, performance and maintenance of station/network.

REFERENCES

1. International Organization for Standardization: Quality Management and Quality Assurance – Vocabulary, ISO 8402, Second Edition.

2. World Meteorological Organization, 1996: Guide to Meteorological Instruments and Methods of Observation, WMO-No. 8.

3. World Meteorological Organization, 1993: Guide on GDPS, WMO-No. 305.

4. World Meteorological Organization, 2001: Manual on Codes, WMO-No. 306, Volume I.2.

5. World Meteorological Organization, 1992: Manual on GDPS, WMO-No. 485, Volume I.

6. World Meteorological Organization, 1989: Guide on GOS, WMO-No. 488.

7. World Meteorological Organization, 2003: Manual on GOS, WMO-No. 544, Volume I.

8. Automated Surface Observing System (ASOS) User's Guide, www.nws.noaa.gov/asos/aum-toc.pdf

9. Fiebrich, C.A. and Crawford, K.C., 2001: The Impact of Unique Meteorological Phenomena Detected by the Oklahoma Mesonet and ARS Micronet on Automated Quality Control, Bulletin of the American Meteorological Society, Vol. 82, No. 10. http://hprcc.unl.edu/aws/publications.htm

10. Vejen, F. (ed.), Jacobsson, C., Fredriksson, U., Moe, M., Andresen, L., Hellsten, E., Rissanen, P., Pálsdóttir, Þ. and Arason, Þ., 2002: Quality Control of Meteorological Observations, Automatic Methods Used in the Nordic Countries, Report 8/2002. http://www.smhi.se/hfa_coord/nordklim/

11. Boudreau, L.D. and Zucconi, A.: Implementing an Enhanced and Integrated Quality Assurance and Quality Control System within the MSC's New Data Management Framework. http://ams.confex.com/ams/Annual2006/techprogram/paper_100879.htm


P A R T VII
MONITORING THE OPERATION OF THE GLOBAL OBSERVING SYSTEM
7.1 GENERAL
The WWW Plan provides for the monitoring of the operational performance of its various components in order to evaluate their efficiency, to identify deficiencies and to take corrective actions with a view to maintaining the overall efficiency and effectiveness of the WWW on a global, regional and national level.
As the operations of the three core elements of the WWW, namely the GOS, the GDPFS and the GTS, are so closely interrelated, each element cannot be monitored independently. For this reason, in order to monitor the WWW as an integrated system, close co-ordination between all centres concerned and the WMO Secretariat is essential if deficiencies are to be identified and corrective actions initiated quickly.
The “Plan for Monitoring the Operation of the WWW” is reproduced in the Manual on the GOS as Part VII as well as in the Manual on the GDPFS (WMO-No. 485) and in the Manual on the GTS (WMO-No. 386). According to this Plan, the monitoring is performed on a real-time basis and on a non-real-time basis. Explanations of these terms, as well as the procedure for follow-up actions, are given in the Plan.
7.2 IMPLEMENTATION OF THE MONITORING OF THE GOS
7.2.1 Monitoring of the availability of observational data (quantity monitoring of the operation of the WWW)
The periodic status report on the implementation of the WWW, issued by the WMO Secretariat at two-year intervals, includes statistics concerning the availability of observational reports of various kinds. It is designed to inform the management of the NMHSs of the operational status of the WWW; it provides information concerning the structure, status and trends of the implementation, as well as the performance of the core components of the WWW, including various statistics on the availability of observational reports and data. The current version of the Status Report can be downloaded from www.wmo.int/web/www/StatusReport.html. The information is based on monitoring on a non-real-time basis, carried out over a specified 15-day period preceding the preparation of the report. Such limited monitoring is, however, carried out every year, although the status report is published only in alternate years.

The following types of quantity monitoring are co-ordinated by the WMO Secretariat within the framework of the WWW Programme:



  • The Annual Global Monitoring (AGM)

  • The Special MTN Monitoring (SMM)


7.2.1.1 Annual Global Monitoring

The AGM is carried out in October each year. The WWW centres are invited to monitor SYNOP, TEMP, PILOT, CLIMAT and CLIMAT TEMP reports from the RBSN stations in accordance with the responsibility taken for the exchange of data on the GTS:



  • The NMCs should monitor data from their own territory;

  • RTHs should at least monitor data from their associated NMCs, and possibly from their own Region;

  • WMCs and RTHs located on the MTN should monitor the complete global data set.

Each year about 100 WWW centres send their monitoring results to the WMO Secretariat through the Internet, on diskette or on paper.


The results of the AGM make it possible to compare the availability of the reports received from RBSN stations at the NMC responsible for inserting the data in the Regional Meteorological Telecommunication Network (RMTN), at the associated RTH and at MTN centres. The differences in the availability of data between centres are due to the following main reasons: differences in data reception requirements, shortcomings in the relay of the data on the GTS, data not monitored, and differences in the implementation of the monitoring procedures at centres.
The AGM has the following limitations:

  • It provides monitoring information over a limited period each year;

  • It provides information at the report level but no information at the bulletin level for RBSN stations;

  • The differences in the implementation of the monitoring procedures at centres lead to differences in the availability of reports between centres.


7.2.1.2 Special MTN monitoring
With a view to complementing the AGM, SMM was implemented. Taking into account the limited resources available at WWW centres to carry out the monitoring activities, it was agreed to share the workload of the SMM between the MTN centres.
One of the main features of the SMM is that the sets of messages (also called raw data) provided by the various MTN monitoring centres are processed by a pre-analysis centre (unique for each type of data). This feature aims at eliminating the discrepancies in the availability of data reported by monitoring centres due to differences in the implementation of monitoring procedures, as is the case for the AGM, primarily due to different methods of counting the reports. The objective of the pre-analysis is to prepare files having a database structure and containing the information extracted from all the sets of messages provided by the monitoring centres. The pre-analysis files represent a unique reference for each type of data for further analysis. One advantage of the SMM is that it is always possible to access the raw data and read the complete text of the bulletins as received by the monitoring centres. The SMM provides complete monitoring information at the report and bulletin levels for any further analysis.
The SMM is carried out four times each year, during the periods 1-15 January, April, July and October. The responsibilities taken by the MTN centres are given in Tables A and B below.


Surface data from fixed stations (SYNOP reports):
  Centres providing raw data: Algiers, Melbourne, Offenbach, Toulouse, Tokyo
  Centre preparing the pre-analysis: Tokyo

Upper-air data from fixed stations (Parts A of TEMP and PILOT reports; proposed extension: BUFR wind profiler data):
  Centres providing raw data: Melbourne, Nairobi, Toulouse, Tokyo
  Centre preparing the pre-analysis: Tokyo

Climate data (CLIMAT and CLIMAT TEMP reports):
  Centres providing raw data: Cairo, Melbourne, New Delhi, Toulouse
  Centre preparing the pre-analysis: Cairo

Data from marine stations (SHIP, TEMP SHIP, PILOT SHIP, BUOY, BATHY/TESAC/TRACKOB reports):
  Centres providing raw data: Cairo, Melbourne, Offenbach, Toulouse
  Centre preparing the pre-analysis: Offenbach

Data from aircraft (AIREP and AMDAR reports; proposed extension: BUFR aircraft reports):
  Centres providing raw data: Melbourne, Nairobi, Toulouse, Tokyo
  Centre preparing the pre-analysis: Toulouse

Table A – Responsibilities of SMM centres




Type of data              T1T2    GGgg
SYNOP                     SM      0000, 0600, 1200, 1800
TEMP, PILOT               US      0000, 0600, 1200, 1800
CLIMAT                    CS      report of the previous month
CLIMAT TEMP               CU      report of the previous month
SHIP                      SM      0000, 0600, 1200, 1800
TEMP SHIP, PILOT SHIP     US      0000, 0600, 1200, 1800
BUOY                      SS      all bulletins
BATHY/TESAC/TRACKOB       SO      all bulletins
AIREP                     UA      all bulletins
AMDAR                     UD      all bulletins

Table B – Data monitored by the MTN centres


After receiving the background information, RTH Toulouse and the Secretariat make an analysis of the monitoring results.

The results of the analysis of the last AGM and SMM exercises made by the WMO Secretariat can be accessed from www.wmo.int/web/www/ois/monitor/monitor-home.htm, where more information about the Quantity monitoring of the operation of the WWW can also be found.



A project on Integrated WWW Monitoring is under development and will be available for use in the near future.
The Plan for Monitoring the Operation of the WWW mentioned above states that, in the context of monitoring, the GOS is responsible for ensuring that the observations are made according to the prescribed standards, are encoded correctly and are presented for transmission at the stipulated times. Monitoring of the GOS is thus essentially a question of quality control of the observations. The basic rules of quality control within the framework of the GOS are contained in the Manual on the GOS (WMO-No. 544), Volume I, Part V. Detailed instructions about the quality control procedures which Members are invited to follow are given in Part VI of the present publication. Additional information can also be found in the Guide to Meteorological Instruments and Methods of Observation (WMO-No. 8), Chapter 3, Part III.
7.2.2 Data quality monitoring
7.2.2.1 Monitoring centres
For the assessment of data quality, a number of data processing centres compare the information received from each of the different types of observations with the first-guess numerical short-term forecast. The participating centres produce monthly reports of the various observational data that are of consistently low quality. These lists of 'suspect' data are exchanged between participating centres, and communicated to the originating country for remedial action. To assist in this action, national focal points have been designated. This feedback leads to improvements in the quality of observational data and ultimately to an improved initial analysis state and better model forecasts.
The following centres participate in the monthly monitoring of data quality:


Each centre issues a monthly report containing monthly suspect lists for the observation types indicated:

  • ECMWF: marine, radiosonde, aircraft and satellite observations; COSNA monitoring.

  • RSMC Bracknell: land, marine, radiosonde, aircraft and satellite observations.

  • WMC Melbourne: land, marine and radiosonde observations.

  • RSMC Montreal: land, marine, radiosonde, aircraft and satellite observations.

  • RSMC Offenbach: land observations.

  • RSMC Tokyo: land, marine, radiosonde, aircraft and satellite observations.

  • RSMC Toulouse: land, marine, radiosonde, aircraft and satellite observations; COSNA monitoring.

Lead Centres have been established by CBS for coordinating the monitoring results of specific observation types. The Lead Centres produce six-monthly consolidated reports of the observations with data of consistently low quality. These reports are also known as 'suspect' lists. The Lead Centres are as follows:




Centre               Data type                    Area of responsibility
WMC Washington       aircraft and satellite data  global
RSMC ECMWF           upper-air data               global
RSMC Bracknell       surface marine data          global
RSMC Nairobi         land surface observations    RA I
RSMC Tokyo           land surface observations    RA II
RSMC Buenos Aires    land surface observations    RA III
RSMC Montreal        land surface observations    RA IV
WMC Melbourne        land surface observations    RA V
RSMC Offenbach       land surface observations    RA VI

7.2.2.2 Procedures and formats for exchange of monitoring results


Approved quality monitoring procedures and formats for the exchange of the monitoring results for surface and upper-air data including marine, aircraft and satellite data have been developed, updated, and published in the Manual on the GDPFS (WMO-No. 485), Attachment II.10. The six-monthly consolidated suspect reports are distributed to Members so that they can take remedial action as required. These Members/agencies then report to lead centres and the WMO Secretariat on their remedial efforts.
More information on data quality monitoring, monitoring procedures and report types can be found at www.wmo.int/web/www/DPS/Monitoring-home/mon-index.htm, and e.g. at:

http://www.met-office.gov.uk/research/nwp/observations/monitoring/marine/index.html.


7.2.2.2 Role of the appointed lead centres
7.2.2.2.1 Real-time feedback
The feedback of data-monitoring results to the data producers is of prime importance, not only in delayed mode but also in near-real time. This was established through a pilot study undertaken by RSMC ECMWF exchanging near-real-time information on the performance of upper-air stations with focal points directly responsible for the operation of such stations. The lead centres, in co-operation with other GDPFS centres, are encouraged to widen their contacts with data providers. Errors identified in one or more data items through real-time quality control procedures can be immediately communicated back to the data producers for correction.
7.2.2.2.2 Exchange of consolidated results
The consolidated lists of suspect stations and data platforms compiled by the lead centres contain those stations for which it has been established with confidence that they produce observations of consistently low quality. Where possible, the problem is defined through clear evidence. Recognizing the fact that deteriorations in observation quality can be detected on time scales much shorter than six months, which is the current interval for producing consolidated monitoring information, each lead centre determines the appropriate response time for communicating suspect stations, observing platforms or systems to the appropriate focal points, the WMO Secretariat and other GDPFS centres.
The reports attached to the consolidated lists sent to WMO should be short. They may have a technical attachment, and it should also be made clear that detailed information can be provided by the lead centre on request.
7.2.2.3 Procedures and formats for the exchange of monitoring results
7.2.2.3.1 General remarks
Centres participating in the exchange of monitoring results will implement standard procedures and use agreed formats for communicating the information to both other centres and the data providers. The following list is incomplete and requires further development in the light of practical experience. Guidance will be given through the initiative of the lead centres in their corresponding fields of responsibility.
In view of the fact that the monthly lists of suspect stations could be misinterpreted if the methods of compilation are not completely understood, they should be circulated only to those centres which indicate that they would like to participate in the monitoring programme. In addition, they should contain a clear explanation of the criteria used and the limitations of the system.
7.2.2.3.2 Upper-air observations
Monthly exchange of monitoring results for upper-air observations should include lists of stations/ships with the following information:



  • List 1: GEOPOTENTIAL HEIGHT

Month/year

Monitoring centre

Standard of comparison (first-guess/background field)



Selection criteria:
For 0000 and 1200 UTC separately, at least three levels each with at least ten observations during the month and a weighted rms departure of at least 100 m from the field used for comparison, between 1000 hPa and 50 hPa.
The gross error limits to be used for observed minus reference field are as follows:

Level (hPa)    Geopotential height (m)
1000           100
925            100
850            100
700            100
500            150
400            175
300            200
250            225
200            250
150            275
100            300
70             375
50             400
Weights to be used at each level are as follows:



Level (hPa)    Weight
1000           3.70
950            3.55
700            3.40
500            2.90
400            2.20
300            1.60
250            1.50
200            1.37
150            1.19
100            1.00
70             0.87
50             0.80

Data to be listed for each selected station/ship should include:


WMO identifier;

Observation time;

Latitude/longitude (for land stations);

Pressure of the level with largest weighted rms departure;

Number of observations received (including gross errors);

Number of gross errors;

Percentage of observations rejected by the data assimilation;

Mean departure from reference field;

Rms departure from reference field (unweighted).
Gross errors should be excluded from the calculation of the mean and rms departures; they should not be taken into account in the percentage of rejected data (in either the numerator or the denominator).
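The List 1 statistics above can be sketched as follows. The function names, dictionary layout and the decision to count the minimum ten observations after gross-error exclusion are assumptions for illustration, not part of the monitoring standard:

```python
import math

def level_statistics(departures, gross_limit, weight, n_rejected=0):
    """Statistics for one level: departures are obs-minus-reference (m).

    Gross errors (|departure| > gross_limit) are excluded from the mean
    and rms and from the rejection percentage, as the text specifies.
    """
    good = [d for d in departures if abs(d) <= gross_limit]
    n_gross = len(departures) - len(good)
    if not good:
        return None
    mean = sum(good) / len(good)
    rms = math.sqrt(sum(d * d for d in good) / len(good))
    return {
        "n_received": len(departures),
        "n_gross": n_gross,
        "mean": mean,
        "rms": rms,                    # unweighted, as listed
        "weighted_rms": weight * rms,  # used for the 100 m criterion
        "reject_pct": 100.0 * n_rejected / len(good),
    }

def station_selected(level_stats, min_levels=3, threshold=100.0, min_obs=10):
    """Apply the List 1 selection rule for one synoptic hour."""
    flagged = [s for s in level_stats
               if s is not None
               and s["n_received"] - s["n_gross"] >= min_obs
               and s["weighted_rms"] >= threshold]
    return len(flagged) >= min_levels
```

A station is then selected for the suspect list when at least three levels exceed the weighted 100 m threshold with sufficient observations.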


  • List 2: WIND

Month/year

Monitoring centre

Standard of comparison (first-guess/background field)


Selection criteria:
For 0000 and 1200 UTC separately, at least one level with at least ten observations during the month and an rms vector departure of at least 15 m.s-1 from the field used for comparison, between 1000 hPa and 100 hPa.
The gross error limits to be used are as follows:

Level (hPa)    Wind (m.s-1)
1000           35
925            35
850            35
700            40
500            45
400            50
300            60
250            60
200            50
150            50
100            45
Data to be listed for each selected station/ship should include:


WMO identifier;

Observation time;

Latitude/longitude (for land stations);

Pressure of the level with largest rms departure;

Number of observations received (including gross errors);

Number of gross errors;

Percentage of observations rejected by the data assimilation;

Mean departure from reference field for u-component;

Mean departure from reference field for v-component;

Rms vector departure from reference field.


Gross errors should be handled in the same way as for List 1.
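The rms *vector* departure of List 2 combines the u- and v-component differences, after excluding gross errors (vector differences above the level's limit). The sketch below is illustrative; names and the input format are assumptions:

```python
import math

def wind_departure_stats(pairs, gross_limit):
    """pairs: (du, dv) obs-minus-reference wind components in m/s."""
    # Exclude gross errors: vector difference larger than the limit.
    good = [(du, dv) for du, dv in pairs
            if math.hypot(du, dv) <= gross_limit]
    n_gross = len(pairs) - len(good)
    if not good:
        return None
    mean_du = sum(du for du, _ in good) / len(good)
    mean_dv = sum(dv for _, dv in good) / len(good)
    # rms vector departure: mean of squared vector differences.
    rms_vec = math.sqrt(sum(du * du + dv * dv for du, dv in good) / len(good))
    return {"n": len(pairs), "n_gross": n_gross,
            "mean_du": mean_du, "mean_dv": mean_dv,
            "rms_vector": rms_vec}
```

The mean departures for the two components are reported separately, while the 15 m.s-1 selection criterion applies to the rms vector departure.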


  • List 3: WIND DIRECTION

Month/year

Monitoring centre

Standard of comparison (first-guess/background field)


Selection criteria:
For 0000 and 1200 UTC separately, at least five observations at each standard level from 500 to 150 hPa, and, for the average over that layer, mean departure from reference field at least +/-10 degrees, standard deviation less than 30 degrees, maximum vertical spread less than 10 degrees.
Same limits for gross errors as above; data for which the wind speed is less than 5 m.s-1, either observed or calculated, should also be excluded from the statistics.
Data to be listed for each selected station/ship should include:
WMO identifier;

Observation time;

Latitude/longitude (for land stations);

Minimum number of observations at each level from 500 to 150 hPa (excluding gross errors and data with low wind speed);

Mean departure from reference field for wind direction, averaged over the layer;

Maximum spread of the mean departure at each level around the average;

Standard deviation of the departure from reference field, averaged over the layer.
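Wind-direction differences must be wrapped into a half-open 360-degree interval before averaging, otherwise a 350-degree observation against a 10-degree reference contributes a 340-degree difference instead of a 20-degree one. This is a generic circular-statistics sketch, not a method prescribed by the Plan:

```python
import math

def wrap_deg(d):
    """Wrap an angular difference into [-180, 180) degrees."""
    return (d + 180.0) % 360.0 - 180.0

def direction_departure_stats(obs_dirs, ref_dirs):
    """Mean and standard deviation of wrapped direction departures."""
    diffs = [wrap_deg(o - r) for o, r in zip(obs_dirs, ref_dirs)]
    mean = sum(diffs) / len(diffs)
    var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
    return {"mean": mean, "std": math.sqrt(var)}
```

Gross errors and low-speed winds (below 5 m.s-1) would be filtered out of `obs_dirs`/`ref_dirs` before calling this, as the criteria above require.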
7.2.2.3.3 Marine surface observations
(a) Monthly exchange of monitoring results for marine surface observations should include lists of observing platforms arranged as follows:


From ships, moored buoys and other fixed marine platforms:

  • List 1: Mean sea-level pressure

  • List 2: Wind speed

  • List 3: Wind direction

From drifting buoys:

  • List 4: Mean sea-level pressure

  • List 5: Wind speed

  • List 6: Wind direction

(b) Each list should contain the following information:

(i) Month/year;

Monitoring centre;

Standard of comparison (first guess/background field);
(ii) The following data for each selected platform:
WMO identifier;

Average latitude/longitude over the month (for Lists 4-6 only);

Number of observations received (including gross errors);

Number of observations containing gross errors;

Percentage of observations containing gross errors;

Standard deviation of the departures from the reference field;

Mean departure from the reference field;

RMS departure from the reference field;


(Gross errors should be excluded from the calculation of the mean, standard deviation and RMS departures. For Lists 3 and 5, data for which the wind speed is less than 3 m.s-1, either observed or calculated, should also be excluded from all the statistics);
(c) The selection criteria for observing platforms in each of the lists are as follows:


  • LIST 1: MEAN SEA-LEVEL PRESSURE FROM SHIPS, MOORED BUOYS AND OTHER FIXED MARINE PLATFORMS
Selection criteria:
For 0000, 0600, 1200 and 1800 UTC combined, at least 20 observations during the month and at least one of the following:
The absolute value of the mean difference from the reference field is at least 4 hPa;
The standard deviation of the differences from the reference field is at least 6 hPa;

At least 25 per cent of observations have gross errors.


(The gross error limit to be used for observed minus reference field is 15 hPa).
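The List 1 selection rule above can be sketched directly. The function name and input format are illustrative; the default thresholds are the ones stated in the criteria:

```python
import math

def mslp_platform_suspect(departures, gross_limit=15.0, min_obs=20,
                          mean_limit=4.0, std_limit=6.0,
                          gross_pct_limit=25.0):
    """departures: a month of obs-minus-reference pressure values (hPa).

    Returns True if the platform meets any of the List 1 criteria.
    Gross errors are excluded from the mean and standard deviation.
    """
    n = len(departures)
    if n < min_obs:
        return False          # fewer than 20 observations: not assessed
    good = [d for d in departures if abs(d) <= gross_limit]
    gross_pct = 100.0 * (n - len(good)) / n
    if gross_pct >= gross_pct_limit:
        return True           # at least 25% gross errors
    if not good:
        return False
    mean = sum(good) / len(good)
    std = math.sqrt(sum((d - mean) ** 2 for d in good) / len(good))
    return abs(mean) >= mean_limit or std >= std_limit
```

Lists 2-6 follow the same pattern with their own thresholds (5 m.s-1 mean wind-speed difference, 30/80-degree direction criteria, and so on).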


  • LIST 2: WIND SPEED FROM SHIPS, MOORED BUOYS AND OTHER FIXED PLATFORMS
Selection criteria:
For 0000, 0600, 1200 and 1800 UTC combined, at least 20 observations during the month and at least one of the following:
The absolute value of the mean difference from the reference field is at least 5 m.s-1;
At least 25 per cent of observations have gross errors.
(The gross error limit to be used for observed minus reference field (vector wind difference) is 25 m.s-1).


  • LIST 3: WIND DIRECTION FROM SHIPS, MOORED BUOYS AND OTHER FIXED PLATFORMS

Selection criteria:
For 0000, 0600, 1200 and 1800 UTC combined, at least 20 observations during the month and at least one of the following:
The absolute value of the mean difference from the reference field is at least 30 degrees;
The standard deviation of the differences from the reference field is at least 80 degrees.
(The gross error limit to be used for observed minus reference field (vector wind difference) is 25 m.s-1).


  • LIST 4: MEAN SEA-LEVEL PRESSURE FROM DRIFTING BUOYS


Selection criteria:
For all data times combined, at least 20 observations during the month and at least one of the following:
The absolute value of the mean difference from the reference field is at least 4 hPa;
The standard deviation of the differences from the reference field is at least 6 hPa;
At least 25 per cent of observations have gross errors.
(The gross error limit to be used for observed minus reference field is 15 hPa).


  • LIST 5: WIND SPEED FROM DRIFTING BUOYS


Selection criteria:
For all data times combined, at least 20 observations during the month and at least one of the following:
The absolute value of the mean difference from the reference field is at least 5 m.s-1;
At least 25 per cent of observations have gross errors.
(The gross error limit to be used for observed minus reference field (vector wind difference) is 25 m.s-1).


  • LIST 6 : WIND DIRECTION FROM DRIFTING BUOYS


Selection criteria:
For all data times combined, at least 20 observations during the month and at least one of the following:
The absolute value of the mean difference from the reference field is at least 30 degrees;
The standard deviation of the differences from the reference field is at least 80 degrees.
(The gross error limit to be used for observed minus reference field (vector wind difference) is 25 m.s-1).
7.2.2.3.4 Land surface observations
(To be completed with information from lead centres)

REFERENCES
Manual on the Global Data-Processing and Forecasting System (WMO-No. 485).
Manual on the Global Observing System (WMO-No. 544).
Manual on the Global Telecommunication System (WMO-No. 386).

____________

