A design and implementation of a data warehouse for research administration universities

OLAP (On-Line Analytical Processing) tools [3] are then used to process operations on the data stored in the DW. These tools operate on the multidimensional view of the data (the hypercube) and provide roll-up, drill-down, slice and similar operators that allow the end user to navigate along and inside the hypercube. Other exploitation operations, such as data mining techniques implementing knowledge discovery, can also be applied to the data warehouse. Data warehousing is today widely used in the business field, and many industrial products are successfully marketed: ETL (Extraction-Transformation-Loading) tools, dedicated DBMSs and OLAP systems [OLA]. As Figure 2 shows, the public sector in France lags behind business fields in data warehousing.
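As a rough illustration of these operators (not tied to any particular OLAP product), the following Python sketch mimics roll-up, drill-down and slice on a toy sales cube using pandas; the dimensions and figures are invented for the example.

```python
import pandas as pd

# Toy sales cube: dimensions (year, region, product), measure (amount).
df = pd.DataFrame({
    "year":    [2022, 2022, 2023, 2023],
    "region":  ["EU", "US", "EU", "US"],
    "product": ["A", "B", "A", "B"],
    "amount":  [100, 150, 120, 130],
})

# Roll-up: aggregate the product dimension away.
rollup = df.groupby(["year", "region"])["amount"].sum()

# Drill-down: reintroduce the finer product level.
drilldown = df.groupby(["year", "region", "product"])["amount"].sum()

# Slice: fix one dimension to a single member.
slice_eu = df[df["region"] == "EU"]

print(rollup, drilldown, slice_eu, sep="\n\n")
```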

Standard implementation trajectories for sustainable product design: A configurational approach

Abstract: While sustainability issues increasingly gain importance in new product design, they also further complicate the NPD process. In many cases it is hard to measure the socio-environmental impact of new products exactly, and sustainability targets may conflict with other ones. Innovators can aim to manage these challenges by turning to voluntary sustainability standards (VSS), like the practices and certificates that come with the EU Ecolabel, Greenguard or Cradle to Cradle standards. VSS are predefined rules, procedures, and methods for common and voluntary use that focus on social and environmental performance. It is proposed that the local implementation of these general standards from outside the organization will likely lead to a variety of firm-specific implementation trajectories, ultimately resulting in different levels of VSS implementation extensiveness across firms. This variety, which is not sufficiently addressed in extant research, is examined in the current study. Using organizational learning as a theoretical lens, this study investigates the configuration(s) of factors, including the embeddedness of the relationship between the focal firm and standard-specific organizations, that drive VSS implementation extensiveness. In doing so, it uses the configurational research approach fuzzy-set Qualitative Comparative Analysis (fsQCA). Empirically, the study draws on qualitative and quantitative data from an international collection of firms that implemented the Cradle to Cradle standard. The study shows that VSS are multifaceted and that the configurations that consistently drive VSS certification extensiveness differ from those that drive VSS practice implementation extensiveness. Additionally, it is found that configurations that consistently lead to the absence of high implementation extensiveness do not simply mirror those for high implementation extensiveness but have unique properties. Finally, the study illustrates that similar levels of implementation extensiveness can result from multiple distinct configurations. The study mainly contributes to extant research on sustainable product design and on how to integrate general principles of sustainable design into the NPD process.
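For readers unfamiliar with fsQCA, its core measure is set-theoretic consistency: the degree to which membership in a configuration is a subset of membership in the outcome. Below is a minimal Python sketch of this standard formula; the membership scores are invented for illustration and are not data from the study.

```python
def consistency(x, y):
    """Fuzzy-set consistency of 'X is sufficient for Y' (Ragin):
    sum(min(x_i, y_i)) / sum(x_i)."""
    num = sum(min(a, b) for a, b in zip(x, y))
    den = sum(x)
    return num / den if den else 0.0

# Hypothetical membership scores of four firms in a candidate
# configuration (x) and in the outcome 'high VSS implementation
# extensiveness' (y).
x = [0.8, 0.6, 0.9, 0.2]
y = [0.7, 0.6, 1.0, 0.4]
print(round(consistency(x, y), 3))  # 0.96
```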

Proactive Data Quality Management for Data Warehouse Systems - A Metadata based Data Quality System

3 Conclusion

The proposed data quality system provides an approach for data quality planning and data quality measuring in data warehousing. As one of the key components of proactive data quality management, the system establishes the foundation for ensuring a high level of data quality in data warehouse systems. It provides a way of stating data quality requirements through a rule base (e.g. integrity constraints) and of measuring the current quality level by applying these specified rules. The implementation of the system in a large Swiss bank demonstrates its practical capability to measure data quality. The measured quality statements provide quality information to end users, who do not need detailed knowledge about technical data models and transformation processes. The article shows the basic concept of the components and the rule bases of the data quality system. Practical experience shows that the rules have to be stated by the end users and may become very complex: only the end users are able to state their quality requirements in a non-formalized way and have the business knowledge to understand the data (e.g. semantic constraints). Even though the system is implemented, further research is required. For example, more detailed research is needed on how to state quality rules (e.g. using data mining and statistics). Common organizational concepts and methods for data warehouse systems have to be enhanced with data quality aspects (DQ officer, information requirements analysis). This approach could also be applied to other fields such as e-commerce, logistics and knowledge management.
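A minimal sketch of the rule-base idea follows, assuming quality rules can be expressed as row-level predicates; the rule names, fields and scoring below are hypothetical, not the bank system's actual design.

```python
# Hypothetical rule base: each rule maps a name to a row-level predicate.
rules = {
    "customer_id not null": lambda row: row.get("customer_id") is not None,
    "balance >= 0":         lambda row: row.get("balance", 0) >= 0,
}

def measure_quality(rows):
    """Fraction of rows satisfying each rule = measured quality level."""
    return {name: sum(bool(rule(r)) for r in rows) / len(rows)
            for name, rule in rules.items()}

rows = [
    {"customer_id": 1,    "balance": 10.0},
    {"customer_id": None, "balance": -5.0},
]
print(measure_quality(rows))
# {'customer_id not null': 0.5, 'balance >= 0': 0.5}
```

Such per-rule ratios are exactly the kind of quality statement an end user can read without knowing the underlying data model or transformation processes.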

Exploring the Structural Dimension of Data Warehouse Organizations: Results of a Survey and Implications

In total, responses were received from 66 organizations, including 8 insufficiently completed questionnaires. Thus, the usable response rate was 39.7% (58 valid responses from 146 conference attendees). Approximately two-thirds of the participating companies were from the banking (32%), insurance (17%) and telecommunications (16%) sectors. Accordingly, 72% of the participating companies had more than 999 employees and only 16% were SMEs. On average, respondents had about 6.58 years of experience in data warehousing, and 39 employees were engaged in data warehousing. Further details about the population characteristics are given in the appendix. As described above, the survey focuses on the structural dimension of data warehouse organizations. Since the early 1990s there has been a trend towards business processes as the primary design objects of the structural dimension of organizations (e.g. Hammer/Champy 1993, Davenport 1993, Oesterle 1995). Therefore this paper applies a process-oriented view, called data warehousing, to data warehouse systems. In the literature no generally accepted definition of the term data warehousing exists; in particular, the number and naming of the process phases vary. For example, Gardner (1998) differentiates the phases planning, design/implementation, and usage/support/enhancement. Vassiliadis et al. (2001) separate the process into the three phases design/implementation, administration, and evolution. In order to reduce complexity when conducting the survey, data warehousing is separated into only two main phases, 'development' and 'maintenance'. This differentiation was used in all questions concerning the data warehouse process.

Design and implementation of a microservice for deletion of resources in the Multi-Agent Research and Simulation distributed system

Distributed systems play a major role in our lives. They are widely used in areas such as the Internet, healthcare, education, science, eCommerce, financial trading and others. The prime motivation for constructing and using distributed systems is the desire to share resources. The term 'resource' covers the range of things that can be usefully shared in a networked computer system. The definition spans from hardware components such as powerful processors and storage devices to software-defined entities such as files, databases and data objects of all kinds. It includes the stream of video frames produced by a digital video camera and the audio connection that a mobile phone call represents. However, there are challenges when designing and building distributed systems. A major concern is concurrency: the presence of multiple processes and users in a distributed system is a source of concurrent requests to its resources. Each resource must therefore be designed to be safe in a concurrent environment [1].
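As a minimal illustration of this concurrency concern (not taken from the thesis), the following Python sketch makes a shared resource safe by serializing updates with a lock; without the lock, concurrent increments could be lost.

```python
import threading

class SharedResource:
    """A shared counter made safe for concurrent access with a lock."""

    def __init__(self):
        self._lock = threading.Lock()
        self.value = 0

    def update(self, delta):
        with self._lock:  # serialize concurrent requests
            self.value += delta

resource = SharedResource()
threads = [threading.Thread(target=resource.update, args=(1,))
           for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(resource.value)  # always 100, regardless of scheduling
```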

A framework for mobile agents in peer-to-peer networks - design and implementation

Mobile agents are transferable pieces of software which can solve tasks and travel through the network autonomously. Normally they are deployed in so-called sea-of-data applications, in which data is distributed over many systems or may not leave those systems due to privacy concerns or other security restrictions, like medical data in hospitals. Research on such systems has been done intensively, but only under the premise that central security policies and control can be established. An example of such an architecture is MARISM-A [1]. A public peer-to-peer network, however, is a different scenario, requiring other security strategies and leading to other problems. But mobile agents are a good solution for adding functionality, because new functionality can be introduced by simply writing and deploying new mobile agents, which will roam the peer-to-peer network searching for specific data or doing calculations on the user's behalf. Furthermore, agents may roam the network while the user's peer is offline, making the approach very attractive for emerging technologies like UMTS, with much higher connection fees than traditional connection types.

A data network for health e-Research

Turning to our design for describing services, our architecture is strongly influenced by developments in shared statistical metadata such as those arising from the FASTER project and its precursors [FAS]. Distributed multi-party privacy protection protocols have recently been proposed for the linkage of clinical data at an individual level [KBH02, Chu03]. A trusted third party is required to provide these linkage services. Our architecture supports the implementation of these protocols in a form more readily available to researchers. Currently, we have developed linking services based on the algorithms implemented in the Febrl package [CC03]. Remote access data laboratories offering on-line access to "Confidentialised Unit Record Files" can also be mapped onto our service-oriented architecture [ABS06]. In contrast, we are proposing to enable linking and analysis of raw datasets in a secure environment with strict authorisation procedures and confidentialisation of released data products.
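The sketch below is not the Febrl API but a generic illustration of the field-similarity record linkage that such services perform; the records, fields and threshold are hypothetical.

```python
from difflib import SequenceMatcher

def field_similarity(a, b):
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_link(rec_a, rec_b, fields, threshold=0.85):
    """Average field similarity compared against a match threshold."""
    score = sum(field_similarity(rec_a[f], rec_b[f]) for f in fields)
    return score / len(fields) >= threshold

a = {"name": "Jon Smith",  "dob": "1970-01-02"}
b = {"name": "John Smith", "dob": "1970-01-02"}
print(is_link(a, b, ["name", "dob"]))  # True
```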

Design and implementation of a generic modeling framework - a platform for integrated land use modeling

Land-use, as "the total of arrangement, activities and inputs that people undertake in a certain land cover type", in contrast to land-cover as the "observed physical and biological cover of the earth's land, as vegetation or man-made features" (FAO, 1999), is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change (Lambin et al., 2000). Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Concerns about land-use and land-cover change first emerged on the agenda of global environmental change research several decades ago, when the research community became aware that land-surface processes influence climate (Lambin et al., 2006). While the focus initially lay on the surface-atmosphere energy exchanges determined by modified surface albedo (Ottermann, 1974; Charney and Stone, 1975; Sagan et al., 1979), the view later shifted to terrestrial ecosystems acting as sources and sinks of carbon (Woodwell et al., 1983; Houghton et al., 1985). A broader range of impacts of land-use change on ecosystems has been identified since then. Besides being a major influencing factor on climate (Brovkin et al., 1999), land-use is meanwhile regarded as the most important factor influencing both biodiversity and biogeochemical cycles on the global scale (Sala et al., 2000). To close the circle, land-use itself is strongly influenced by environmental conditions like climate and soil quality, which affect e.g. the suitability for certain crop types and thus agricultural use or biomass production (Mendelsohn and Dinar, 1999; Wolf et al., 2003).

Design and implementation of pay for performance

A substantial literature discusses potential problems arising from subjective evaluation (Murphy 1992; Prendergast & Topel 1993; Murphy & Cleveland 1995; Prendergast 1999). Because the evaluation is at the supervisor's discretion, supervisor preferences and incentives may play a role in the evaluation. A supervisor might engage in favoritism toward some employees, or manipulate the evaluation process to reduce compensation costs. Such behavior might expose the firm to legal liabilities from lawsuits alleging discrimination or wrongful termination. Subjective evaluation can also distort incentives. The employee might act as a "yes man", behaving in ways the supervisor prefers but that correlate imperfectly with firm value (Prendergast 1993). The employee may also try to manipulate the evaluation by making negative evaluations personally costly for the supervisor (Milgrom 1988; Milgrom & Roberts 1988). As a response to such influence costs, the supervisor might be too lenient or reduce the variance of evaluations (leniency and centrality biases). Supervisors might also exhibit hindsight bias, holding the employee responsible for factors that are known to the supervisor ex post but that the employee did not know at the time he performed the work. Finally, subjective evaluations carry their own form of uncontrollable risk for the employee: they are difficult to verify and enforce contractually, so they require relational contracting and adequate trust in the supervisor (Baker, Gibbons & Murphy 1994).

Design and Implementation of a Middleware for Uniform, Federated and Dynamic Event Processing

The BE-tree is a dynamic index structure that can be updated at runtime. It makes no assumptions about the frequencies of data items and index updates; thus, index updates at high frequency must be supported. The BE-tree achieves this by fast local reorganization. Space clustering affects only single partitions and is deterministic (see Definition 8); consequently, local reorganization is sufficient to achieve an optimal space clustering of each partition. In contrast, space partitioning affects the entire BE-tree and is not deterministic. To achieve fast adaptation, space partitioning is nevertheless also performed only locally, by splitting overflowing leaf nodes. A globally optimal space partitioning (according to the scoring function) is, in general, not achieved by locally optimal space partitioning. In fact, the global space partitioning depends on the insertion order of the indexable Boolean expressions. To handle this problem, the original BE-tree proposes a self-adjustment technique. In this approach, leaf nodes are continuously scored. If the score of a leaf node drops below some threshold, the leaf node is entirely or partly deleted and all affected indexable Boolean expressions are reinserted. The problems of this approach are as follows. First, it requires additional monitoring at runtime (insertions, deletions and matches of each l-node must be monitored in detail), which contradicts the strict performance requirements of EP. Second, there are no guarantees whether and when a globally optimal partitioning is achieved. Third, the reinsertion might again happen in a suboptimal insertion order.
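A rough sketch of the self-adjustment idea follows, under the simplifying assumption that a leaf's score can be reduced to a matches-per-insertion ratio; the real BE-tree scoring function and tree interface differ, and `tree.insert` is a hypothetical placeholder.

```python
# Hypothetical bookkeeping for score-based self-adjustment of a leaf node.
class Leaf:
    def __init__(self):
        self.exprs = []    # indexed Boolean expressions
        self.inserts = 0   # monitored at runtime (an overhead of the scheme)
        self.matches = 0

    def score(self):
        # Toy score: matches per insertion; the BE-tree's actual
        # scoring function is more elaborate.
        return self.matches / self.inserts if self.inserts else 1.0

SCORE_THRESHOLD = 0.1  # assumed threshold

def self_adjust(tree, leaf):
    """Delete a poorly scoring leaf and reinsert its expressions."""
    if leaf.score() < SCORE_THRESHOLD:
        pending, leaf.exprs = leaf.exprs, []
        for expr in pending:
            tree.insert(expr)  # may again yield a suboptimal order
```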

Research through DESIGN through research - a problem statement and a conceptual sketch

uncertainty and unpredictability (Schön 1983) will subsequently be replaced by well-grounded knowledge. But exclusively scientific research is unable to fully recognize the implications of acting in a space of imagination and projection, where design criteria only become apparent after the outcome has been designed. Therefore the "knowledge base position" needs to be complemented by the "unknowledge base position" (Jonas, Chow, Verhaag 2005) or by the competencies to deal with not-knowing (Willke 2002). It is not science as a method, but science as a guiding paradigm for design, that is being called into question. Examining design as processes in the course of socio-techno-cultural evolution will reveal more clearly what is impossible and will enable us to identify the stable islands of reliable knowledge. This view adopts the circular and reflective "trial & error" models of generative world appropriation, as put forward by Dewey (1986), von Foerster (1981), Glanville (1982), Schön (1983), Swann (2002) and many others. Furthermore the hierarchical separation of basic / applied / clinical

Design and implementation of a synthetic pre-miR switch for controlling miRNA biogenesis in mammals

For a long time it was assumed that mainly protein factors are relevant for gene regulation. In the meantime it has become obvious that functional RNAs play a key role in determining differential gene expression and thus constitute another group of relevant trans regulators. Although the existence of non-coding (nc) RNAs has been known for many years, their significance for the organism still has to be elucidated in large part [28]. Many of these ncRNAs are transcribed at very low levels, and no specific function has been attributed to them yet. Still, there is increasing evidence that some of them are strongly involved in a multitude of cellular processes [29]. Due to their activity in the regulation of gene expression, the small ncRNAs are of particular interest. The 20-30 nucleotide long main representatives of this class have been categorized, according to origin, function and associated effector proteins, into Piwi-interacting (pi)RNAs, small interfering (si)RNAs and micro (mi)RNAs. These molecules are exclusively present in eukaryotes and constitute the key players of RNA interference (RNAi) [30]. RNAi is a biological mechanism that was first discovered in 1993 in Caenorhabditis elegans, when Ambros and coworkers found the miRNA lin-4 to be an active endogenous regulator of developmental genes [31]. The term "RNAi" was later defined by Mello and Conte [32]. Over the years, this mechanism has turned out to be present in a huge variety of eukaryotic organisms, namely plants, fungi and animals, where it leads to the targeted silencing of specific genes during development or serves as a defense mechanism against invasive nucleic acids [33]. The small ncRNAs exhibit a common mode of action, that is, the specific binding of a ribonucleoprotein complex to its target gene (reviewed in [34]).

Das Common Warehouse Metamodel - ein Referenzmodell für Data-Warehouse-Metadaten

[Me03] Melchert, F.: Das Common Warehouse Metamodel als Standard für Metadaten im Data Warehousing. In: von Maur, E.; Winter, R. (Hrsg.): Data Warehouse Management. Das St. Galler Konzept zur ganzheitlichen Gestaltung der Informationslogistik. Springer, Berlin et al. 2003. S. 89-111.

A Geoprivacy by Design Guideline for Research Campaigns That Use Participatory Sensing Data

Spatial-point aggregation (Andrienko & Andrienko, 2011; Monreale et al., 2010), or spatial-areal and temporal aggregation, also known as cloaking (Cheng, Zhang, Bertino, & Prabhakar, 2006; Gruteser & Grunwald, 2003; Kalnis, Ghinita, Mouratidis, & Papadias, 2007), follows the same approach as statistical aggregation. In particular, it decreases the precision of the original data. Point aggregation can be used both for privacy protection and as a generalization approach to visualize flows in movements and between areas. With cloaking, the time duration of an object at one location is considered a quasi-identifier. Given the number of other objects at this location and for this time duration, a decision is taken to decrease the spatial resolution; similarly, one can lower the temporal resolution. Because cloaking is designed for LBS data, the anonymity it offers is calculated based on the number of other data subjects (i.e., users of a service) at a particular time and location. Considering the number of users of an LBS, this approach can provide sufficient anonymity. However, the number of participants in participatory sensing studies will probably be much lower, and this will greatly affect the anonymized dataset's spatial precision due to larger disclosed regions and/or coarser time. Generally, all techniques that involve some sort of spatial aggregation will affect analytical results due to the modifiable areal unit problem (Openshaw & Openshaw, 1984). In practice, polygon or point clusters of the measurements' values may appear or disappear depending on the aggregation's division of the space.
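Below is a minimal sketch of spatial cloaking on a toy integer grid, assuming subjects are (x, y) points and the cell edge is doubled until at least k subjects fall into the disclosed cell; all names and the grid model are invented for illustration.

```python
def cloak(points, target, k, max_size=64):
    """Coarsen the grid cell containing `target` until it holds at least
    k subjects; returns (cell_x, cell_y, cell_size) or None."""
    size = 1
    while size <= max_size:
        cell = (target[0] // size, target[1] // size)
        inside = [p for p in points
                  if (p[0] // size, p[1] // size) == cell]
        if len(inside) >= k:
            return (cell[0] * size, cell[1] * size, size)  # disclosed region
        size *= 2  # halve the spatial resolution and try again
    return None  # no cell within the limit offers k-anonymity

subjects = [(3, 4), (5, 6), (12, 2), (60, 60)]
print(cloak(subjects, target=(3, 4), k=3))  # (0, 0, 16)
```

With few participants, as the text notes, the loop must coarsen further before reaching k subjects, so the disclosed region grows and spatial precision drops.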

Design and implementation of a verifier for sequential programs using the Hoare calculus

to find all possible execution paths. Those are then checked against constraints, e.g. invariants such as a combination of variable values that should never occur. If an execution path is discovered that violates these conditions, it is exposed, and the programmer can inspect the execution path that led to the error in order to fix its root cause. This approach essentially traverses all accessible variations of execution before it marks the program as approved. That is why it involves exponential growth in computation time and required memory, rendering the algorithm impractical fairly quickly. Model checking also necessitates a closed, finite system (or an algorithm to render it as such); otherwise the program could keep allocating memory, the number of states would never be exhausted, and the verifier might never come to a conclusion. Dynamic data structures may be examined by dedicated methods like shape analysis [Wik17f]. Both the model and the specifications are described in appropriate languages that the verifier can work with. Because the intent is to explore the state space, the modeling language must project a finite state machine. Examples are PROMELA (Process
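A toy illustration of exhaustive state-space exploration follows, assuming the system is given as an initial state, a successor function and an invariant; this is a plain breadth-first sketch, not the verifier developed in this work.

```python
from collections import deque

def check(initial, successors, invariant):
    """Breadth-first exploration of all reachable states; returns a path
    to an invariant violation, or None if the property holds."""
    seen = {initial}
    queue = deque([(initial, [initial])])
    while queue:
        state, path = queue.popleft()
        if not invariant(state):
            return path  # counterexample trace
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None

# Toy system: a counter that increments up to 5 but must stay below 4.
trace = check(0, lambda s: [s + 1] if s < 5 else [], lambda s: s < 4)
print(trace)  # [0, 1, 2, 3, 4] -- the execution path leading to the error
```

The `seen` set is exactly what blows up on real programs: it must hold every reachable state, which is why the state space has to be finite for the search to terminate.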

Design and implementation of non-photorealistic rendering techniques for 3D geospatial data

This section presents a technique for image stylization that employs (re-)colorization and non-linear image filtering to devise artistic renditions of 2D images with oil-paint characteristics. Rather than attempting to simulate oil paint via aligned strokes (Haeberli 1990; Hertzmann 1998; Hays & Essa 2004; Zeng et al. 2009) or through physically-based techniques (W. Baxter et al. 2004; Lu et al. 2013), the proposed method formulates I1 to I3 as sub-problems of image filtering (Figure VII.7). The first problem is solved by performing a recolorization, using the optimization-based approach of Levin et al. (2004), with the dominant colors of the input image used for quantization. This approach produces more homogeneous color distributions than local image filtering techniques and gives users more control in refining global color tones. The second problem is solved using the smoothed structure tensor (Brox et al. 2006b), which is adapted to the feature contours of the quantized output, together with principles of line integral convolution (Cabral & Leedom 1993) and Phong shading (Phong 1975), to obtain a flow-based paint texture in real time. Finally, the third problem is addressed by an interactive painting interface that implements GPU-based per-pixel parameterizations via virtual brush models, giving users local control for adjusting paint directions, shading effects, and the LoA. The approach provides versatile parameterization capabilities to resemble paint modes that range from highly detailed to abstract styles.
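As an illustration of the smoothed structure tensor underlying the flow-based paint texture, the following numpy/scipy sketch computes the Gaussian-smoothed tensor components of a grayscale image; it omits the described adaptation to the quantized output's feature contours.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smoothed_structure_tensor(gray, sigma=2.0):
    """Gaussian-smoothed structure tensor of a grayscale image.
    Returns the per-pixel components E, F, G of [[E, F], [F, G]]."""
    gy, gx = np.gradient(gray.astype(float))  # image gradients
    E = gaussian_filter(gx * gx, sigma)
    F = gaussian_filter(gx * gy, sigma)
    G = gaussian_filter(gy * gy, sigma)
    return E, F, G  # minor eigenvectors give the local flow directions

img = np.random.rand(64, 64)  # stand-in for a quantized input image
E, F, G = smoothed_structure_tensor(img)
```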

Datenqualitätsmanagement in Data Warehouse-Systemen

Historization: a way of storing data that is common in data warehouse systems, in which a tuple or object in a database is not overwritten in the course of an update but is instead given a [r]
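Since the entry is truncated, the following sketch only illustrates the general versioning idea, in the style of a slowly changing dimension with assumed validity-interval columns; the table, columns and helper function are hypothetical.

```python
from datetime import date

# Hypothetical historized table: rows are closed, never overwritten.
history = [
    {"id": 7, "city": "Bern",
     "valid_from": date(2020, 1, 1), "valid_to": None},
]

def update_city(cust_id, new_city, today):
    for row in history:
        if row["id"] == cust_id and row["valid_to"] is None:
            row["valid_to"] = today  # close the current version
    history.append({"id": cust_id, "city": new_city,
                    "valid_from": today, "valid_to": None})

update_city(7, "Basel", date(2024, 6, 1))
# Both versions remain queryable: the closed row and the current one.
for row in history:
    print(row)
```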

Design and implementation of a dedicated planning and implementation software system using a driving simulator to support occupational therapy intervention and data collection in the context of studies

The Protocol Manager does not use any external identity management system but stores user data in the server's own database. Since passwords themselves are not stored, only their hash values, the following password-related columns are included in the user table: password hash, hashing algorithm, salt, and number of rounds. By default the application uses a SHA-256 [28] hash, salted with a 32-byte random sequence, which is hashed for 200 rounds. In addition, the client has to request hashing instructions prior to authentication; these contain the information needed to pre-hash the password by splitting the number of rounds between client and server. This way no plain-text password has to be sent over the network. Because users tend to use the same password for different systems, this mechanism protects the user's plain password even in the worst cases of a man-in-the-middle attack or a server compromise.
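A minimal sketch of the described scheme follows: salted, iterated SHA-256 with the 200 rounds split between client and server. The 100/100 split and all names are assumptions, since the text does not give the actual values.

```python
import hashlib
import hmac
import os

TOTAL_ROUNDS = 200   # from the text
CLIENT_ROUNDS = 100  # assumed split between client and server

def iterate_sha256(data, rounds):
    """Apply SHA-256 repeatedly for the given number of rounds."""
    for _ in range(rounds):
        data = hashlib.sha256(data).digest()
    return data

salt = os.urandom(32)  # 32-byte random salt, per the text

def client_prehash(password):
    # The client pre-hashes, so the plain password never leaves the device.
    return iterate_sha256(password.encode() + salt, CLIENT_ROUNDS)

def server_hash(prehash):
    # The server finishes the remaining rounds and stores only the result.
    return iterate_sha256(prehash, TOTAL_ROUNDS - CLIENT_ROUNDS)

stored = server_hash(client_prehash("correct horse"))
assert hmac.compare_digest(stored, server_hash(client_prehash("correct horse")))
```

Splitting the rounds means an eavesdropper only ever sees an intermediate hash, never the plaintext, which is what protects reused passwords against the attacks mentioned above.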

Implementation of a circular product design – case study of a smartphone manufacturer

Concepts that were predecessors of the CE had already emerged in the second half of the past century. Increasing attention to a conscious handling of resources resulted from the rising environmental damages that the widespread report "The Limits to Growth" (Meadows et al. 1972), initiated by the Club of Rome, had outlined. Changes in the socio-economic and regulatory landscapes, for example the change in resource price volatility caused by growing modern economies and the burgeoning middle-class consumers entering the market, caused people to question the feasibility of the traditional, linear economy following the 'take-make-dispose' approach (Accenture 2014; World Economic Forum, Ellen MacArthur Foundation, & McKinsey & Company 2014). Focusing on the characteristics of self-reinforcement and regeneration, the Ellen MacArthur Foundation (EMF) has developed the most recognized and comprehensive approach to describing a CE, bringing together different schools of thought and disciplines. The focus is on keeping products, components and resources at their highest value at all times.

Towards a nationwide implementation of a standardized nutrition and dietetics terminology in clinical practice: a pre-implementation focus group study including a pretest and using the consolidated framework for implementation research

Identified criteria (points to consider) in terms of barriers and facilitators could be linked to interventions and responsibilities by applying a structured framework to inform a targeted nationwide implementation strategy. Using the codebook of the CFIR provided us with constructs and definitions for organizing the qualitative data. The main benefit of this approach, namely having a priori templates, is that it accelerates the coding process and generates comparable results. The first coding and the development of intervention-specific themes helped us overcome the disadvantage of attending to predefined constructs, namely missing important aspects. From our findings in relation to in-depth interviews, we conclude that focus groups were a suitable method to identify barriers and facilitators regarding the implementation of a standardized terminology. We found that focus groups allowed the participants to discuss and question the perspectives of colleagues, which in turn raised further important perspectives that could be investigated in addition, for example in terms of Core Set development and assessment instruments.
