Global problems of the environment and nature management: ecological and analytical environmental monitoring. Information and analytical monitoring technologies

The information and analytical system KPS "Monitoring-Analysis" makes it possible to control the customs clearance process with respect to the nomenclature, value and weight of the goods released and the accrual of customs payments.

"Monitoring Analysis" implements the integration process for various information sources (database of GTD, DB TP NSI, BD Eagruck, EGRN BD) and the subsequent used accumulated (aggregated) data for the formation of different reports and references.

"Monitoring analysis" performs the following functions:

- providing access to the central database (CBD) of GTDs, as well as to the CBD of customs receipt orders (TPO);

- providing the ability to create and edit conditions that restrict the selection of data from the CBD GTD;

- visual display and output of report information;

- refinement of generated reports in Microsoft Excel.

In "Monitoring-Analysis", information on the activities of customs authorities in clearing GTDs can be presented according to various criteria, including:

- value, weight and nomenclature of the goods released;

- accrued payments;

- country of origin and country of destination of the goods moved;

- participants in customs clearance (customs authorities, customs inspectors, foreign trade (VED) participants);

- dynamics of customs clearance processes.

"Monitoring Analysis" makes it possible to receive both general data of the customs clearance of goods and detailed information on each of the VED participants, a specific warehouse and a customs inspector.

Additionally, "monitoring analysis" provides an opportunity for access (analysis and control) to the delivery processes of goods under customs control.

Monitoring analysis "has a pronounced three-level structure. The user (via the Internet conductor) sends a request to the www server. The www server transmits a request to Oracle DBMS. The DBMS processes the request and returns the WWW server.

The WWW server in turn converts the received data into an HTML page and returns the result to the user. All updates of the "Monitoring-Analysis" software therefore take place on the WWW server and in the Oracle DBMS, and the changes immediately become available to users.
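This request/response cycle can be sketched in a few lines. The sketch below is illustrative only: Python's built-in http.server and an in-memory sqlite3 database stand in for the WWW server and the Oracle DBMS, and the table and column names are invented.

```python
# Minimal three-tier sketch: browser -> WWW server -> DBMS -> HTML -> browser.
# sqlite3 stands in for the Oracle DBMS; all names and figures are illustrative.
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

db = sqlite3.connect(":memory:", check_same_thread=False)
db.execute("CREATE TABLE gtd (number TEXT, value REAL)")
db.executemany("INSERT INTO gtd VALUES (?, ?)",
               [("10000001", 12500.0), ("10000002", 7300.5)])

class ReportHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Tier 2: the WWW server forwards the request to the DBMS ...
        rows = db.execute("SELECT number, value FROM gtd").fetchall()
        # ... then converts the returned data into an HTML page for the user.
        body = "<table>" + "".join(
            f"<tr><td>{n}</td><td>{v}</td></tr>" for n, v in rows) + "</table>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ReportHandler).serve_forever()
```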

"Monitoring-Analysis" also includes the following subsystems:

- CBD TPO - monitoring of the customs clearance of TPOs against the TPO CBD;

- CBD DKD - monitoring of delivery processes under customs control (access to the Delivery-CBD database);

- search in the EGRN and EGRUL registers - retrieval of information on legal entities participating in customs clearance processes.

3. GENERAL INFORMATION ON AC ADPR "Analysis-2000"

The databases of the EAIS of the FCS of Russia store huge, continuously processed volumes of information on various aspects of customs activity, including electronic copies of cargo customs declarations (GTD) and customs receipt orders (TPO) issued by Russian customs offices since 1991. The databases grow on average by 600 thousand records per quarter (about 2.5 million per year). This data array contains the most valuable information on Russia's foreign economic activity.

These significant volumes of information on Russia's foreign economic activity require effective processing tools to support customs management decision-making.

The first step in creating a full-scale corporate-level decision support system (DSS) was a system for operational multidimensional analysis of electronic copies of customs documents, which provides a qualitatively new level of data analysis compared with conventional statistical analysis of performance indicators.

The systemic effect of introducing the "Analysis-2000" system includes:

- reduction of time and labor costs needed to obtain aggregated information;

- increased labor productivity of FCS staff;

- improving the quality of analytical data issued at the request of higher organizations;

- providing leadership and middle managers, as well as analysts, with the ability to navigate huge volumes of data and select the information needed for decision-making;

- ensuring a graphical representation of data.


Analytical control of environmental quality consists of the following stages:

1. Selection of the sampling site;

2. Sampling;

3. Sample processing;

4. Measurement of the pollutant concentration;

5. Mathematical processing and verification of the data;

6. Interpretation and comparison of the data obtained.

When choosing a site for sampling (gaseous, liquid or solid samples), the following factors should be considered: the geographical, geological and ecological features of the area; the distribution of pollution in time and space; and the meteorological and hydrological conditions.

You can "cover" the studied area of \u200b\u200bthe grid with a suitable step scale and take samples in all nodal points. In other cases, you can take samples in characteristic places with different expected pollution. The number of sampling points on this territory depends on the technical and economic opportunities of the station or post.

Sampling must yield a statistically averaged sample, which is most easily achieved with liquid samples. A statistically averaged sample of soil or biota is obtained by taking a series of samples at different points and then averaging them mechanically in a ball mill or by dissolving them in acids; an analytical sample of a given mass is then taken from the resulting material. A statistically averaged air sample can be obtained by pumping large volumes of air through special filters or liquid absorbers and then eluting the absorbed pollutant with a special solution.

Sample treatment can be carried out immediately after collection. If storage or transportation is expected, losses of the pollutant due to adsorption on the vessel walls, sedimentation of particles or chemical reactions must be taken into account. The most suitable vessels are made of polyethylene or Teflon. Liquid samples are filled "to the stopper". Solid samples are likewise isolated from contact with air. Other types of sample treatment involve the concentration and separation of pollutants, which is usually done in a specialized laboratory.

Stage "Measurement" It is an analytical determination of the concentration of the pollutant, including the choice of the method of analysis, the preparation of the sample according to the propagation of the method, calibration of instruments, testing the method using standards, carrying out idle experiments.

In monitoring, standard or generally accepted interdepartmental or departmental methods of analysis are used.


Ecological and analytical environmental monitoring.

Ecological and analytical monitoring - monitoring of the content of pollutants in water, air and soil using physical, chemical and physicochemical methods of analysis - makes it possible to detect the entry of pollutants into the environment, to distinguish the effect of anthropogenic factors against the natural background, and to optimize human interaction with nature. Soil monitoring, for example, provides for the determination of acidity, salinity and humus loss.

Chemical monitoring, a part of ecological and analytical monitoring, is a system for observing the chemical composition of the atmosphere, precipitation, surface water and groundwater, ocean and sea water, soils, bottom sediments, vegetation and animals, and for controlling the dynamics of the spread of chemical pollutants. Its task is to determine the actual level of environmental pollution by highly toxic ingredients; its purpose is scientific and technical support of the observation and forecasting system: identifying the sources and factors of pollution and their degree of impact; observing the established sources of pollutants entering the natural environment and the level of its pollution; assessing the actual pollution of the natural environment; and forecasting environmental pollution and ways to improve the situation.

Such a system is based on sectoral and regional data and includes elements of those subsystems; it can cover both local areas within one state (national monitoring) and the Earth as a whole (global monitoring).

Ecological and analytical monitoring of pollution as part of the Unified State System of Environmental Monitoring. In order to radically increase the efficiency of work on preserving and improving the environment and ensuring environmental safety, on November 24, 1993 the Government of the Russian Federation adopted Decree No. 1229 "On the creation of a Unified State System of Environmental Monitoring" (EGSEM). The organization of work on creating EGSEM provides for including new types and kinds of pollutants in the scope of monitoring and identifying their environmental impact, and for expanding the geography of environmental monitoring to new territories and sources of pollution.

The main tasks of EGSEM:

- development of environmental monitoring programs in Russia, in its individual regions and districts;

- organization of observation and measurements of indicators of environmental monitoring objects;

- ensuring the reliability and comparability of observational data both within individual regions and districts and throughout Russia;

- collection and processing of observation data;

- storage of observation data, the creation of special data banks characterizing the environmental situation in Russia and in its separate areas;

- harmonization of environmental data banks and databases with international environmental information systems;

- assessment and forecasting of the state of environmental objects and of anthropogenic impacts on them, of natural resources, and of the responses of ecosystems and public health to changes in the human environment;

- conducting operational control and precision measurements of radioactive and chemical pollution resulting from accidents and disasters, as well as forecasting the environmental situation and assessing the damage caused to the natural environment;

- making integrated environmental information available to a wide range of consumers, social movements and organizations;

- informing the authorities about the state of the environment, natural resources and environmental safety;

- development and implementation of a unified scientific and technical policy in the field of environmental monitoring.

EGSEM provides for the creation of two interrelated blocks: monitoring of ecosystem pollution and monitoring of the environmental consequences of such pollution. In addition, it should provide information on the initial (baseline) state of the biosphere and identify anthropogenic changes against the background of natural variability.

Currently, observations of the levels of pollution of the atmosphere, soil, and the waters and bottom sediments of rivers, lakes, reservoirs and seas, by physical, chemical and (for water bodies) hydrobiological indicators, are carried out by the services of Roshydromet. Monitoring of the sources of anthropogenic impact on the natural environment and the zones of their direct influence, and of the animal and plant world, terrestrial fauna and flora (except forests), is carried out by the corresponding services of the Ministry of Natural Resources. Monitoring of lands, the geological environment and groundwater is carried out by the divisions of the Committee of the Russian Federation on Land Resources and Land Management and the Committee of the Russian Federation on Geology and the Use of Subsoil.

In 2000, the Roshydromet system operated 150 chemical laboratories and 41 cluster laboratories analyzing air samples from 89 cities with non-laboratory monitoring. Atmospheric pollution observations were carried out at 682 stationary posts in 248 cities and towns of the Russian Federation; agricultural land was not left without attention either.

Surface waters of the land are monitored at 1,175 watercourses and 151 reservoirs. Sampling is conducted at 1,892 points (2,604 gauging sections). In 2000, 30,000 water samples were analyzed for 113 indicators. Observation points for marine pollution exist on 11 seas washing the territory of the Russian Federation. The Roshydromet system annually analyzes more than 3,000 samples for 12 indicators.

The network of stations observing the transboundary transport of pollutants is concentrated on Russia's western border. The Pushkinskie Gory and Pinega stations currently operate there, sampling atmospheric aerosols, gases and precipitation.

The chemical composition and acidity of atmospheric precipitation are monitored at 147 federal and regional stations. In most samples only the pH value is measured. When tracking snow-cover pollution, ammonium ions, sulfate ions, benzo(a)pyrene and heavy metals are also determined in the samples.

The system of global atmospheric background monitoring includes three types of stations: basic, regional and regional with an extended program.

Six integrated background monitoring stations have also been created, located in biosphere reserves: Barguzinsky, Central Forest, Voronezh, Prioksko-Terrasny, Astrakhan and Caucasian.

Radiation monitoring in the country, especially in areas contaminated by the Chernobyl accident and other radiation disasters, uses a stationary network and mobile facilities. Under a special program, an aerial gamma survey of the territory of the Russian Federation is also conducted.

Within the framework of EGSEM, a system for the rapid detection of pollution associated with emergency situations is being created.

Ecological and analytical monitoring of pollution within EGSEM can be divided into three large blocks: pollution control in zones of significant anthropogenic impact, at the regional level, and at the background level.

All data from zones with any level of impact, both emergency and routine, are sent at certain intervals to the information collection and processing center. In the automated system currently under development, the primary level is a local system serving an individual district or city.

Information from mobile stations and stationary laboratories on environmental pollution by dioxins and related compounds is processed, sorted and transmitted to the next level - the regional information centers. From there the data are sent to interested organizations. The third level of the system is the main data center, where information on pollution is generalized on a national scale.

The efficiency of automated systems for processing ecological and analytical information grows noticeably when automatic stations for monitoring water and air pollution are used. Local automated air pollution control systems have been created in Moscow, St. Petersburg, Chelyabinsk, Nizhny Novgorod, Sterlitamak, Ufa and other cities. Pilot tests of automated stations for monitoring water quality are under way at water discharge and water intake sites. Instruments have been created for the continuous determination of oxides of nitrogen, sulfur and carbon, ozone, ammonia, chlorine and volatile hydrocarbons. Automated water pollution control stations measure temperature, pH, electrical conductivity, oxygen content, chloride ions, fluorine, copper, nitrates, etc.

The information and analytical block of monitoring performs its basic function, since informed management decisions require the relevant authorities to analyze and assess the state of the object and the dynamics of its activity. Effective information and analytical support for such tasks can be provided by systems that automate the analytical work of specialists in the management bodies and organize the collection, storage and processing of information. The concept of such systems for a wide class of managed objects should be based on modern integrated data warehouse technology and on in-depth analytical processing of the accumulated information using modern information technologies.

As already noted, traditional and generally accepted sources of primary information are statistical reporting, accounting and management accounting, financial statements, surveys, interviews, polls, etc.

The stage of analytical and statistical processing of structured primary information also has several traditional, generally accepted approaches. These approaches, and their systemic integration, arose from the objective need to automate accounting and statistical work in order to reflect the processes occurring in the analyzed subject area as fully, accurately and promptly as possible, and to identify their characteristic trends.

The automation of statistical work was reflected in the creation and operation of automated statistical information systems: in the 1970s, the automated state statistics system (ASDS), and since 1988 the design of the unified statistical information system (ESIS). The main task of these developments was to collect and process the accounting and statistical information needed for planning and managing the national economy, based on the widespread use of economic statistical methods, computing and organizational equipment, and communication systems in the state statistics bodies.

In its structural and territorial aspect, ASDS was strictly hierarchical, with four levels: union, republican, regional, and district (city). At each level, information was processed primarily to accomplish the tasks of that level.

In the functional aspect, ASDS distinguishes functional and support subsystems. Regardless of the content of specific statistical tasks, these subsystems implemented the functions of collecting and processing statistical information, comprehensive statistical analysis, monitoring the performance of indicators, obtaining the statistical data needed for current and operational planning, and timely submission of all necessary statistics to the governing bodies. From the user's point of view, monitoring tasks are divided by purpose into:

regular tasks related to the processing of statistical reporting data at the corresponding structural and territorial levels of ASDS;

information and reference service tasks;

tasks of in-depth economic analysis.

Regular tasks involve the processing of statistical reporting data at the corresponding levels of ASDS. Each such task is, as a rule, associated with processing the data of one specific form of statistical reporting or of several closely related forms. These tasks are solved by electronic information processing complexes: combinations of software, technical and organizational tools that use local information arrays.

Information and reference service tasks involve generating the necessary statistical data on request for the prompt preparation of reports, analytical notes and references; their content is not regulated. They are solved using an automated data bank: a system for accumulating, storing, searching, processing and issuing information in the required form in response to user requests.

The tasks of in-depth economic analysis are based on the use of the following techniques (a minimal sketch of two of them follows the list):

dynamic series (construction of polygons, frequency histograms and cumulative curves, fitting of trends from a selected class of functions);

smoothing of the original dynamic series (diagnostics based on the selected trend and an autoregressive model, analysis of residuals for autocorrelation and normality);

pair regression (determination of linear and nonlinear regression equations, assessment of their statistical characteristics, selection of the optimal form of the relationship);

multiple regression (determination of the matrix of pairwise correlation coefficients, determination of multiple linear regression equations);

factor analysis (obtaining a linear model described by a small number of factors, calculation of the loadings on the common factors and of the values of the common factors, graphical interpretation of the factors in the plane and in space);

correlation analysis (obtaining correlation matrices, means and standard deviations).
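As promised above, here is a minimal sketch of two of the listed techniques, pair regression and correlation analysis, using numpy; the series are invented.

```python
# Pair (simple) regression and a correlation matrix over invented series.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])    # e.g. reporting periods
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])  # indicator values

# Pair regression: y = a*x + b, with a goodness-of-fit statistic.
a, b = np.polyfit(x, y, 1)
r = np.corrcoef(x, y)[0, 1]
print(f"y = {a:.2f}x + {b:.2f}, r = {r:.3f}, r^2 = {r**2:.3f}")

# Correlation analysis: pairwise coefficients, means, standard deviations.
data = np.vstack([x, y])
print("correlation matrix:\n", np.corrcoef(data))
print("means:", data.mean(axis=1), "std devs:", data.std(axis=1, ddof=1))
```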

The organizational and technological form for solving this class of tasks is analytical complexes: sets of application program packages oriented toward the implementation of mathematical statistical methods. To cover wide time ranges of analyzed data, a register form of monitoring is used, based on automated registers that allow significant sets of data to be maintained and processed as arrays independent of the structure of statistical reports, for each object or for a specific group of monitoring objects. The register form of monitoring is particularly effective for statistical information characterizing relatively stable objects, so registers can be regarded as an automated card file of groups of homogeneous units of statistical observation of a certain type. It allows the user, by filling in a unified request form, to obtain various data characterizing the state of a particular object.

An important direction in improving statistical monitoring was to enrich the content and enhance the reliability and timeliness of reporting data by combining current reporting, one-time accounting, and sample and monographic surveys, as well as by optimizing information flows. Special emphasis was placed on improving economic and mathematical methods for analyzing and forecasting the development of systems. In addition, new information technologies were essential in the evolution of monitoring methods, namely:

development of integrated information processing technologies using data banks and computer networks;

creating computer simulation data processing systems;

development of intelligent end-user computer interfaces based on automated workstations that provide for the use of expert systems.

New information technologies significantly expanded the possibilities for direct automated access to the necessary statistical information and diversified the composition and content of analytical work. It became possible to integrate the unified statistical monitoring information system with other information systems at all levels of government via telecommunication channels.

However, all the considered methods of analytical and statistical data processing have a significant drawback. They treat the entire set of data as a scattered collection, with no systemic unity among the items. Only an artificial connection can be established by combining data into a particular reporting form, and it is impossible to provide such forms for all possible phenomena and connections. Traditional methods of analytical and statistical data processing do not take into account that between phenomena and events of any kind there is a natural connection based on indicators inherent to them all. Given a system of such natural relations, all the factors, events and data relevant to the phenomenon under consideration can be related to it. Monitoring based on such an approach is characterized by full coverage of cause-and-effect relations and of the mutual influence of hidden trends; all of this is considered in inseparable systemic unity.

This shortcoming can be eliminated by a now widespread approach to analytical and statistical data processing based on OLAP (OnLine Analytical Processing) technology.

The term OLAP denotes methods that enable database users to generate descriptive and comparative information about data in real time and to obtain answers to various analytical queries. The defining principles of the OLAP concept include:

multidimensional conceptual representation - an OLAP database must support a multidimensional view of data, providing the classical operations of slicing and rotating the conceptual data cube;

transparency - users do not need to know that they are using an OLAP database; they can use familiar tools to obtain data and make the necessary decisions, and they need not know anything about the data source;

accessibility - the software must select and communicate with the data source best suited to forming the answer to a given query, and must automatically map its own logical schema onto various heterogeneous data sources;

consistent performance - performance should be practically independent of the number of dimensions in the query; system models must be powerful enough to cope with all changes in the model under consideration;

client-server architecture support - OLAP tools must be able to work in a client-server environment, since it is assumed that the multidimensional database server must be accessible from other programs and tools;

equality of all dimensions - every data dimension must be equivalent both in structure and in operational capabilities; the basic data structure, formulas and report formats should not be biased toward any one dimension;

dynamic sparse matrix handling - typical multidimensional models can easily address a large set of cell references, many of which hold no data at a given moment; missing values should be stored efficiently and must not degrade the accuracy or speed of information retrieval;

multi-user support - OLAP tools must support and encourage group work and the sharing of ideas and analysis results between users; for this, multi-user data access is essential;

support for cross-dimensional operations - all multidimensional operations (for example, aggregation) must be defined and accessible in a way that makes them uniform and consistent regardless of the number of dimensions;

intuitive data manipulation - the data provided to the analyst must contain all the information necessary for effective navigation (slicing, changing the level of detail) and for executing the relevant queries;

flexible reporting - the user must be able to extract any data needed and present them in any required form;

unlimited dimensions and aggregation levels - there should be no restrictions on the number of supported dimensions.

The use of systems based on OLAP technology makes it possible to:

organize a unified information store based on statistical and other reporting data;

provide simple and efficient access to the stored information with differentiation of access rights;

provide online analytical processing of the stored data and statistical analysis;

streamline, standardize and automate the creation of analytical report forms with data displayed in a specified way.

The main distinguishing feature and important advantage of the multidimensional presentation of data over traditional information technologies is the possibility of jointly analyzing large groups of interrelated parameters, which is important when studying complex phenomena.

OLAP technology significantly reduces the time needed to collect and analyze the primary information required for decision-making in a given sphere of human activity, and also increases the visibility and informativeness of reports on the processes and phenomena occurring in those spheres.

OLAP systems make it possible to accumulate large volumes of data collected from various sources. Such information is usually placed in a specially organized data warehouse.

Before creating such a system, you should consider and find out three main questions:

how to accumulate the data, model them at the conceptual level and manage their storage;

how to analyze the data;

how to efficiently load data from several independent sources.

These questions correspond to the three main components of a decision support system: the data warehouse server, the online analytical processing tools, and the tools for replenishing the data warehouse.

Since the organization of data warehouses is the subject of other disciplines, we consider only the question of analytical data processing. A number of OLAP tools are currently available for analyzing information, including software products such as MicroStrategy 7i, WebIntelligence, Cognos PowerPlay and AlphaBlox. We will review these products against the following criteria:

ease of use - the software product must be simple enough for a user who does not have special training;

interactivity - the software must provide interactive capabilities, including viewing documents, dynamically updating available documents, access to the latest information, dynamic execution of queries against data sources, and dynamic, unlimited drill-down into the data;

functionality - the application must provide the same capabilities as traditional client/server analogs;

accessibility - information must be accessible from any device and workplace, and the client part must be small enough to accommodate different levels of user network bandwidth and to rely on standardized technology;

architecture - this criterion characterizes the software implementation of the product;

independence from data sources - the application must provide access to any type of document and interactive access to relational and multidimensional databases;

performance and scalability - ensuring the performance and scalability of the application requires universal database access, server-side data caching and the like;

security - the administration aspects of granting different access rights to different categories of users;

cost of implementation and administration - the per-user cost of implementing an OLAP product should be significantly lower than for traditional products.

MicroStrategy 7i is a software product with a wide range of functions, built on a unified server architecture. The user environment is implemented in MicroStrategy Web Professional.

Users are offered a number of statistical, financial and mathematical functions for integrated OLAP and relational analysis. All users have access to both aggregated and detailed information (down to the transaction level). You can perform new calculations, filter report data, rotate reports and add intermediate totals, and quickly change the content of a report.

The main functionality is achieved through the following facilities:

MicroStrategy 7i OLAP Services - an interface to third-party products;

Intelligent Cube technology - simplifies analysis and deployment by providing summary information for quick interactive browsing;

MicroStrategy Narrowcaster - lets users send out indicators or subscribe to them via the Web interface. Users can e-mail their reports, schedule report delivery, publish reports for workgroups and export them to Excel, PDF or HTML formats.

This product provides cross-platform support and integration, portability to UNIX, and support for third-party application servers.

The product is based on an XML architecture. Users can integrate XML code created in MicroStrategy Web into their applications or format it as needed.

The thin client, implemented in HTML, eliminates browser compatibility problems and deploys through all network security facilities. The appearance and functions of the program can be configured for specific needs, and MicroStrategy Web can be embedded into other applications running on the network.

Computers running MicroStrategy Web can be combined into clusters, which provides scalability and reliability; additional equipment can be added. If a failure occurs while a task is executing, the task is handed over to another computer in the same cluster.

Data is protected at the cell level using security filters and access control lists. Web traffic is secured by transport-level data encryption - SSL (Secure Sockets Layer).

WebIntelligence is a Web product for creating queries and reports and for data analysis. It gives network users (both intranet and extranet) secure access to data for further study and management, and makes analytical capabilities available to different categories of users. A wide range of business analysis features is provided, including the creation of complex reports, calculations, filtering, drill-down and aggregation.

WebIntelligence provides the following features:

formatting and printing reports in visual design mode;

multi-block reports: complex reports sometimes need several tables or charts to convey comprehensive information, so WebIntelligence allows several blocks and charts to be added to a single report;

interactive drill-down into the data.

The product provides a number of functions:

access to data stored both in traditional relational databases and on an OLAP server;

data analysis functions;

information sharing. WebIntelligence is a "thin" client that requires no installation or maintenance of application software or database middleware at the client site. When installing the client part, the technology can be chosen; deployment on Microsoft Windows and UNIX platforms is supported.

Using WebIntelligence, you can explore and analyze various OLAP data sources, as well as combine OLAP and relational data.

The product can be configured to match the corporate structure of almost any object.

WebIntelligence can run either on a single server or on several NT or UNIX machines. Servers can be added to the system as needed, and if one component fails, another is used automatically. Weighted load balancing between multiple servers optimizes system resources and guarantees short response times.

WebIntelligence uses various information protection technologies. Where necessary, components are authenticated using digital certificate technology. The hypertext transfer protocol is used to work with various network protection systems.

The application has a standard Web interface. It supports the main features for viewing, exploring, reporting and publishing OLAP data interactively: data selection by specified dimensions and measures, drill-down into the data, nested cross-tables, calculations, enabling/disabling the display of rows, columns and charts, filters and sorting.

Cognos PowerPlay provides the following functions:

HTML/JavaScript applications, which give universal access to any user working with Netscape Navigator version 3.0 and above or with Microsoft Internet Explorer;

access to OLAP data for any category of user;

creation and publication of BPM (Business Performance Management) reports as PDF documents on the Cognos Upfront portal, so that users have access to the most important corporate data in the Web environment;

conversion of data from PDF format into dynamic reports, their further exploration and transmission of the results to Upfront;

server support for the following platforms: Windows NT, Windows 2000 and above, Sun Solaris, HP-UX and IBM AIX.

Thanks to SSL support, PowerPlay guarantees the security of data sent over the Web. In addition, by defining user classes, system administrators can control access both to local cubes and to the Web portal shell. These classes are stored in a special component accessible via LDAP (Lightweight Directory Access Protocol), which is responsible for centralized security management of the entire system and for integration with existing protection mechanisms.

Using HTML to implement the client side allows the PowerPlay server to operate in a protected environment, ensuring the safe deployment of applications for customers, partners and suppliers.

AlphaBlox is middleware that provides tools and building blocks for working on the Web, removing the difficulties associated with securing network connections to databases, authorization and data formatting. The AlphaBlox analytical platform is implemented on a standardized, J2EE-compatible architecture.

AlphaBlox products are designed for analytical computing both inside and outside the object.

Of particular interest are the Java components (Blox), from which an analytical Web application can be assembled. One of the most labor-intensive tasks in creating an OLAP Web product is displaying and formatting data in the browser; very often the data must be shown as a table or chart. When building an application with AlphaBlox, any number of such Java components can be inserted and configured to solve the necessary tasks by setting applet parameters, thereby controlling the appearance and functions of the components. This software product provides the following features:

access to information - data is extracted from various relational and multidimensional databases;

queries and analysis - the components execute simple and complex queries against various data sources without requiring SQL programming;

presentation - the ability to present data in various formats (reports, tables, charts).

The Java components have a modular structure and can be reused. They can be applied to implement analytical capabilities for a multitude of business functions. Since they are controlled by a set of parameters, their properties can be changed using a text editor, which gives flexibility in developing and upgrading an analytical solution. Components can be configured to meet specific business requirements and reused when deploying additional applications in other areas of activity. Application developers can write additional code in JSP, Java servlets or JavaScript.

AlphaBlox solutions use the services provided by the application server and the Java Runtime Environment (JRE), as well as any standard or custom Java extensions designed for this platform.

The structure of AlphaBlox applications is based on standards and allows integration with existing operating systems, transactional infrastructure and legacy systems. It gives users access to data from various sources and supports their subsequent analysis.

AlphaBlox uses standard application server resources and capabilities, including HTTP handling, caching and process monitoring, as well as integration with Web servers. In addition, the J2EE-compatible architecture eliminates unnecessary page refreshes and allows the main logic to run on the server.

AlphaBlox uses the same protection model as the application server, implemented through the standard functions of the J2EE platform. This eliminates the need to create a separate protection mechanism.

Easy deployment is one of the main advantages of a Web application, and this fully applies to AlphaBlox applications. However, they require certain versions of browsers and of the Java platform, whereas a thin HTML client works in most browsers.

Online analysis of data based on OLAP technology allows analysts, managers and executives to gain insight into data through fast, uniform, interactive access to a wide variety of possible views of information: views derived from raw data that reflect the actual state of the object in a form useful to users. OLAP functionality is characterized by dynamic multidimensional analysis of the object's consolidated data, supporting the end user's analytical activities: calculations and modeling applied to the data across dimensions, trend analysis over sequential time intervals, slicing the data for on-screen viewing, changing the level of detail toward deeper levels of generalization, and the like.

OLAP tools are focused on providing multidimensional analysis of information. To achieve this, multidimensional models of data storage and presentation are used. Data are organized in cubes (or hypercubes) defined in a multidimensional space composed of individual dimensions, each of which includes several levels of detail. Typical OLAP operations include changing the level of data presentation (moving up and down the dimension hierarchy), selecting particular parts of the cube (slicing), and reorienting the multidimensional view of the data on the screen (pivoting, i.e. obtaining a consolidated pivot table).
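These basic operations - roll-up along a dimension, slicing and pivoting - can be sketched with a pandas pivot table over a toy fact table; all names and figures below are invented, and a real OLAP server would of course operate on far larger data.

```python
# Toy fact table and basic OLAP-style operations with pandas (invented data).
import pandas as pd

facts = pd.DataFrame({
    "year":    [2000, 2000, 2000, 2001, 2001, 2001],
    "region":  ["North", "South", "North", "South", "North", "South"],
    "product": ["A", "A", "B", "B", "A", "B"],
    "amount":  [10.0, 7.5, 4.2, 6.1, 12.3, 5.0],
})

# Roll-up: aggregate the 'product' dimension away, pivot region vs year.
cube = pd.pivot_table(facts, values="amount", index="region",
                      columns="year", aggfunc="sum", margins=True)
print(cube)

# Slice: fix one member of one dimension (year == 2000).
print(facts[facts["year"] == 2000])

# Pivot (reorient): swap the axes of the consolidated table.
print(cube.T)
```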

For OLAP databases, the APB-1 benchmark was developed. It simulates a realistic workload for OLAP server software. The standard defines the set of dimensions that make up the logical structure: the logical database structure consists of six dimensions - time, scenario, measure, product, customer and channel. The benchmark does not prescribe a specific physical model: input data are supplied as ASCII files. The test operations carefully simulate standard OLAP operations over large volumes of data loaded sequentially from internal or external sources, including information aggregation, drill-down along hierarchies, and calculation of new data based on business models.

The capabilities of OLAP technology form the basis for organizing the multidimensional analysis of monitoring information. Let us consider the stages of this process.

Before information is loaded into the multidimensional monitoring database (MDB), it must be extracted from various sources, cleaned, transformed and consolidated (Fig. 1.3). Subsequently the information must be updated periodically.

Fig. 1.3.

Data extraction is the process of retrieving data from operational databases and other sources. Analysis of the available information sources shows that most are presented as tabular data, either electronic or printed; modern scanning and image recognition tools make it possible to almost completely automate this stage of data preparation.

Before information is entered into the database, it must be cleaned. Cleaning typically involves filling in missing values, correcting typos and other input errors, defining standard abbreviations and formats, replacing synonyms with standard identifiers, and the like. Data that are identified as erroneous and cannot be corrected are discarded.

After the data are cleaned, all the information received must be converted to a format that meets the requirements of the software used (the OLAP server). The transformation procedure is particularly important when data received from several different sources must be combined. This process is called consolidation.
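A minimal sketch of the cleaning and consolidation steps with pandas, assuming two hypothetical tabular sources: missing values are filled, synonyms are replaced by standard identifiers, uncorrectable records are discarded, and the cleaned tables are merged into one dataset.

```python
# Cleaning and consolidating two hypothetical tabular sources with pandas.
import numpy as np
import pandas as pd

src_a = pd.DataFrame({"station": ["st-1", "st-2", "st-2"],
                      "pollutant": ["SO2", "sulfur dioxide", "NOx"],
                      "conc": [0.05, np.nan, 0.12]})
src_b = pd.DataFrame({"station": ["st-3", "st-3"],
                      "pollutant": ["SO2", "NOx"],
                      "conc": [0.08, -1.0]})

synonyms = {"sulfur dioxide": "SO2"}  # synonyms -> standard identifiers

def clean(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df["pollutant"] = df["pollutant"].replace(synonyms)
    df["conc"] = df["conc"].fillna(df["conc"].mean())  # fill missing values
    return df[df["conc"] >= 0]  # discard records that cannot be corrected

# Consolidation: a common format, then a single combined dataset.
combined = pd.concat([clean(src_a), clean(src_b)], ignore_index=True)
print(combined)
```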

The loading step consists of creating the necessary data structure in the MDB and filling it with the information obtained at the previous data preparation stages.

Information can be retrieved from the MDB using Microsoft SQL Server Analysis Services, which acts as a provider of both multidimensional data (Multidimensional Data Provider) and tabular data (Tabular Data Provider). Executing a query thus returns either a multidimensional dataset or an ordinary table, depending on the query language used. Analysis Services supports both SQL and the MDX (Multidimensional Expressions) extensions.

SQL queries can be submitted to Analysis Services using data access tools such as:

Microsoft OLE DB and OLE DB for OLAP;

Microsoft ActiveX Data Objects (ADO) and ActiveX Data Objects Multidimensional (ADO MD).

OLE DB for OLAP extends OLE DB with objects specific to multidimensional data. ADO MD extends ADO in the same way.

Microsoft SQL Server Analysis Services supports the MDX extensions, which provide a rich and powerful query syntax for working with the multidimensional data stored by the OLAP server in cubes. Analysis Services supports MDX functions for defining calculated fields, building local data cubes and executing queries using the PivotTable Service component.

Custom functions that work with multidimensional data can also be created. Interaction with them (passing arguments and returning results) takes place via MDX syntax.

Analysis Services provides more than 100 built-in MDX functions for defining complex calculated fields. These functions fall into the following categories: working with arrays; working with dimensions; working with hierarchies; working with hierarchy levels; logical functions; working with objects; numeric functions; working with sets; working with strings; working with tuples.

It is possible to create local cubes intended for viewing on computers where no OLAP server is installed. Creating local cubes requires MDX syntax and is done through the PivotTable Service component, which is the OLE DB client of the OLAP server. This component also permits offline work with local cubes when there is no connection to the OLAP server, providing an OLE DB data source interface. Local cubes are created with the CREATE CUBE and INSERT INTO statements.

The MDX query language, an extension of SQL, makes it possible to query data cubes and return results as multidimensional datasets.

As in ordinary SQL, the author of an MDX query must first define the structure of the dataset to be returned. In most cases the MDX query returns the dataset as a multidimensional structure. Unlike an ordinary SQL query, which operates on tables to produce a two-dimensional set of records, an MDX query deals with cubes and forms a multidimensional result dataset. Note that an MDX query can also return two-dimensional datasets, which are a special case of multidimensional ones.
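What such an MDX query looks like can be shown with a small example. The cube, dimension and measure names below ([Customs], [Time], the [Measures] members and so on) are hypothetical; in practice the text would be sent to Analysis Services via OLE DB for OLAP or ADO MD rather than merely printed.

```python
# A sample MDX query held as a string; cube and dimension names are hypothetical.
# Executing it would require an OLE DB for OLAP / ADO MD connection, not shown here.
mdx = """
SELECT
    { [Measures].[Payments], [Measures].[Weight] } ON COLUMNS,
    { [Time].[2000].Children }                     ON ROWS
FROM [Customs]
WHERE ( [Region].[North] )
"""
print(mdx)
```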

Visualizing multidimensional datasets can be quite difficult. One visualization method is to restrict the output to a flat, two-dimensional table by nesting several dimensions along one axis. Such nesting produces subheadings.
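A sketch of this nesting with pandas: two dimensions placed together on the row axis flatten a three-dimensional dataset into a two-dimensional table with subheadings (data invented).

```python
# Flattening a 3-D dataset into a 2-D table by nesting two dimensions on rows.
import pandas as pd

facts = pd.DataFrame({
    "region":  ["North", "North", "South", "North", "South", "South"],
    "product": ["A", "B", "A", "A", "B", "A"],
    "year":    [2000, 2000, 2000, 2001, 2001, 2001],
    "amount":  [10.0, 4.2, 7.5, 12.3, 6.1, 8.0],
})

# 'region' and 'product' nested along the row axis -> subheadings per region.
flat = pd.pivot_table(facts, values="amount",
                      index=["region", "product"], columns="year", aggfunc="sum")
print(flat)
```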

PivotTable Service, part of Microsoft SQL Server Analysis Services, is a means of accessing OLAP data; this component functions as the client of Analysis Services.

The functions of PivotTable Service include data analysis, cube building and optimal memory management. The component provides an interface to multidimensional data. Data can be saved in a local cube on the client computer and analyzed later without connecting to the OLAP server. PivotTable Service is needed for the following tasks:

establishing a connection with an OLAP server as a client component;

providing programs with an OLE DB interface with OLAP extensions;

functioning as a tabular data source supporting a subset of SQL;

functioning as a multidimensional data source supporting the MDX extensions;

creating a local data cube;

functioning as a mobile desktop OLAP client.

The PivotTable Service component can work with only one local cube partition, and it has no built-in system for managing the delivery of information. Its performance is therefore directly proportional to the volume of data it addresses.

It should be noted that the OLAP interface is simple and requires no more knowledge than a spreadsheet. OLAP offers various report forms, an interactive data analysis interface and the ability to generate printed forms. Compared with traditional programming of custom reports, OLAP not only cuts programming costs a hundredfold but also changes the very principle of how the user works with a report.

What distinguishes OLAP as a report generation tool is the ability to perform the following operations on data automatically and interactively:

recursive grouping of data; calculation of subtotals for subgroups; calculation of grand totals.

Commands to perform these operations are issued by the user himself, with the sections of the table serving as controls. When the user changes the form of the report (for example, moves a dimension), the system recalculates the subtotals and displays the new report.
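These three operations - grouping, subtotals per subgroup and a grand total - can be sketched directly over an invented dataset.

```python
# Recursive grouping with subtotals per subgroup and a grand total (pandas).
import pandas as pd

df = pd.DataFrame({
    "group":    ["metals", "metals", "gases", "gases"],
    "subgroup": ["Cu", "Pb", "SO2", "NOx"],
    "value":    [4.0, 2.5, 7.1, 3.3],
})

subtotals = df.groupby("group")["value"].sum()  # intermediate totals per group
grand_total = df["value"].sum()                 # final (grand) total
print(subtotals)
print("grand total:", grand_total)
```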

In addition, the user can change sorting, filter by arbitrary combinations of data, view data in percentage terms, change the scale and perform other necessary report transformations (these features are not an indispensable attribute of OLAP technology but depend on the specific implementation of the tool).

As a result, the user can independently and intuitively form from the available dataset all the kinds of reports possible for that set. This helps overcome the eternal limitation of information systems: that the power of their interfaces always lags behind the power of the database.

OLAP technology makes it possible to implement virtually all possible tabular views of the contents of a database. If the product is flexible enough, the programmer's task reduces to describing the semantic layer (the dictionary), after which a qualified user can independently create new cubes, operating with the terms of the subject area familiar to him. The remaining users can build reports on each cube.

Thus, OLAP technology serves both developers and users in all cases where information must be viewed as tabular reports in which the data are grouped and totals are calculated for the groups.

Experience shows that it is not enough to provide users with a large cube consisting of many dimensions and facts. This is due to the following reasons.

First, at any given moment the user needs a quite specific report.

Second, some algorithms for calculating totals are described by complex formulas that the user may not be qualified to define.

Third, an OLAP report may have a specific layout of dimensions and initial sorting conditions prescribed by the author of the reporting methodology.

Fourth, in many cases the data are easier to understand from a chart than from a table of numbers. Configuring an OLAP chart sometimes requires good spatial imagination, since a cube with many dimensions must be rendered as a set of shapes or lines in a three-dimensional image. The number of properties of modern graphics components runs into the thousands, so pre-configuring a chart or graph for an OLAP report can take a long time.

Fifth, as with any other report, effective design matters for an OLAP report: the settings of headers and captions, colors and fonts.

Thus, for the user to work comfortably, an OLAP report must contain a certain set of application metadata describing the aggregation algorithms, the preliminary filtering and sorting conditions, the headings and comments, and the rules of visual design.

When visualizing information from a multidimensional cube, an important factor is ordering the dimensions according to their similarity: dimensions characterizing similar parameters should be placed next to each other. Various clustering methods, including heuristic algorithms, are used to identify such dimensions.

The information and analytical technologies described here are not the only ones possible, but all of them are developments of Business Intelligence (BI), whose purpose is the collection, systematization, analysis and presentation of information. The choice of a specific information and analytical technology is left to the user, taking the specifics of the subject area into account.
