Extending the reach and power of deductive program verification

The aviation industry, which has a high level of reliability in all its systems, is a good example. A very important measure used to achieve this reliability is the careful investigation and analysis of accidents (failures) and immediate feedback to design and operation. The use of well-matured technology also contributes to maintaining this level of reliability [Sakugawa et al., 2005]. The aviation industry also has universal regulations for the use of software in airborne systems. One part of these regulations is the guideline DO-178B [RTCA, 1992]. It lists objectives (for different levels of criticality) that a piece of software must satisfy in order to be certified for airborne use. With increasing levels of criticality, the total number of objectives increases, as does the number of objectives that have to be satisfied “with independence”, i.e., the validation activity has to be performed by a person other than the original developer. The main activity used to validate avionics software is rigorous testing. Reasoning-based formal methods are permitted but neither required nor sufficient by themselves. In general, DO-178B states that “formal methods are complementary to testing”.

Deductive verification of object-oriented software : dynamic frames, dynamic logic and predicate abstraction

In JavaDL, the verification of modifies clauses is based on the notion of location-dependent symbols [Roth, 2006; Engel et al., 2009]. As in the loopInvariant rule, the approach used in this chapter instead uses quantification over locations (in the formula frame). Besides supporting dynamic frames and allowing the creation and initialisation of new objects without this being declared in the modifies clause, a secondary advantage of the JavaDL* approach is that it is most probably easier to understand for the user of the verification system than the approach of JavaDL. Boogie-based verifiers typically use yet another approach, where for every assignment statement it is checked separately that the assigned location is covered by the modifies clause. This approach facilitates user feedback in case the modifies clause is violated, because it makes it easy for the verification system to pinpoint the responsible assignment. An advantage of our approach is that it is more liberal, in that it tolerates temporary modifications (see also the discussion in Subsection 2.2.2). An additional, more pragmatic reason for using frame instead of performing a separate check for every assignment is that this technique fits more naturally into dynamic logic: classically, dynamic logic supports only a postcondition at the end of the verified program, but not in-program assertions that could be attached to individual assignments. An augmented version of dynamic logic that features in-program assertions has been defined by Ulbrich [2010]. In such a dynamic logic, per-assignment checks for modifies clauses become a viable option.
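The difference can be illustrated with a small sketch in Java with JML-style annotations (the class and field names are hypothetical, not taken from the thesis): the method temporarily writes a location outside its declared frame but restores it before returning. A frame check that compares only the pre- and post-states accepts this contract, while a per-assignment check would flag the first write to cache.

```java
// A minimal sketch of a "temporary modification" tolerated by a
// state-comparison frame check but rejected by per-assignment checking.
class Counter {
    int value;
    int cache;  // deliberately not listed in the assignable clause

    /*@ normal_behavior
      @   assignable value;
      @   ensures value == \old(value) + 1;
      @*/
    void increment() {
        int saved = cache;   // remember the old value
        cache = value + 1;   // temporary write outside the declared frame
        value = cache;
        cache = saved;       // restored: pre- and post-state agree on cache
    }
}
```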

Advancing Deductive Program-Level Verification for Real-World Application: Lessons Learned from an Industrial Case Study

In the case of the PikeOS microkernel, implementations of single C functions are deliberately kept simple to facilitate maintainability and certification measures – the functionality of the whole kernel is rather implemented by the interaction of many of these small functions, operating on common data structures. Microkernels, and more generally all operating systems, have to keep track of the overall system’s state, resulting in relatively large and complex data structures on which many of the kernel’s functions operate conjointly. This amount of interdependency has an impact on the following issues. Issue B-1: Entangled Specifications. Function specifications have strong dependencies on each other. Finding the right annotations for a single function requires the verification engineer to consider several functions at once, due to these dependencies. Good feedback from the verification tool in case of a failed verification attempt is essential. Feedback given by annotation-based verification tools so far only focuses on the function currently being verified. This allows the user to pinpoint and fix bugs in the specification or the program, or to change auxiliary annotations, e.g., loop invariants or contracts for called functions. For the verification of single functions, this tool support is sufficient. However, current verification methodologies often do not provide adequate assistance to the user when analyzing problems with interdependent specifications. Example. One example of the interplay between specifications of different implementation parts due to dependencies between functions is examined in detail in Chapter 7. Regarding the complexity of system software, Klein et al. [Kle+14] note that “. . . OS kernels . . . generally feature highly interdependent subsystems” (citing Bowman et al. [BHB99]). That this is not an exclusive property of system software or microkernels in particular, but usual in non-trivial software projects in general, is demonstrated by empirical studies on software complexity metrics – for example, Bhattacharya et al. [Bha+12] examine some metrics of eleven open source software projects of significant size. One of the characteristics analyzed is the diameter of the static call graph (defined as “the longest shortest path between any two vertices in the graph”). For the projects examined, the authors note that for the diameter “the typical value range (10–20) is similar across all programs.”
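The quoted metric is easy to make concrete: for an unweighted static call graph, the diameter can be computed by a breadth-first search from every vertex, keeping the largest finite distance found. The following sketch uses a hypothetical adjacency-map representation, not taken from the cited studies.

```java
import java.util.*;

// Diameter of an unweighted call graph: the longest shortest path
// between any two (mutually reachable) vertices, via all-pairs BFS.
final class CallGraphDiameter {
    static int diameter(Map<String, List<String>> callGraph) {
        int max = 0;
        for (String start : callGraph.keySet()) {
            Map<String, Integer> dist = new HashMap<>();
            Deque<String> queue = new ArrayDeque<>();
            dist.put(start, 0);
            queue.add(start);
            while (!queue.isEmpty()) {          // BFS from `start`
                String v = queue.poll();
                for (String w : callGraph.getOrDefault(v, List.of())) {
                    if (!dist.containsKey(w)) {
                        dist.put(w, dist.get(v) + 1);
                        queue.add(w);
                    }
                }
            }
            for (int d : dist.values()) max = Math.max(max, d);
        }
        return max;  // largest finite shortest-path distance found
    }
}
```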

Deductive Verification of Concurrent Programs and its Application to Secure Information Flow for Java

realisation that one has to specify not only their initial-final state behaviour, but also their interaction at intermediate points.” [de Roever et al., 2001]. Devising a minimal programming language is the key to verifiable programs, as frequently advocated by Hoare [1981]: “You include only those features which you know to be needed for every single application of the language [. . . ]. Then extensions can be specially designed where necessary [. . . ].” In this thesis, we consider a simple concurrent imperative language that we call deterministic While-Release-Fork (dWRF). It extends the sequential language presented by Beckert and Bruns [2013] with interleavings and dynamic thread creation. It is ‘Java-like’ in the sense that it uses both local and global variables (a.k.a. fields) and that an arbitrary number of sequential program fragments can be executed concurrently. dWRF distinguishes between local variables with atomic assignments and global variables with assignments inducing (local) state transitions. The rationale behind this is that, in a concurrent setting, only global memory can be observed by the environment. Expressions do not have side effects. New threads can be spawned in a simple fork statement that includes the program of the thread to create, but does not have parameters. Synchronization is not considered at the moment and will be left to future work. We introduce the syntax of dWRF in Sect. 3.2. Other Java features such as objects, arrays, types, or exceptions are not of relevance to our discourse. These features are largely orthogonal to each other (cf., e.g., [Stärk et al., 2001]) and could be added without invalidating the central results. All such features can be added in principle, but we keep the programming language simple for the presentation in this chapter. In Sect. 7.3, we discuss how to extend dWRF to full Java, which will lead to the development of an extension of the KeY verification system [Ahrendt et al., 2014] to concurrent Java.
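A rough Java analogue of these language features (illustrative only; dWRF's actual syntax is introduced in Sect. 3.2) shows the distinction between environment-observable global memory, thread-private locals, and parameterless forking:

```java
// Java analogue of dWRF's core features; names are hypothetical.
class DwrfAnalogue {
    static int global = 0;  // "global variable": observable by all threads

    public static void main(String[] args) {
        int local = 1;             // "local variable": atomic, thread-private
        final int captured = local;
        // analogue of "fork { global := captured }": the forked program
        // is written inline and takes no parameters
        new Thread(() -> global = captured).start();
    }
}
```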

Deductive Verification of Safety-Critical Java Programs

Concurrency in RT applications gives rise to new phenomena and sources of incorrectness. Accessing shared resources and data can lead to, for instance, deadlocks or starvation of threads. Concurrent read and write accesses to data can lead to race conditions or leave data in an inconsistent state. Verification results obtained for a sequential program can be rendered incorrect when this program is employed in a concurrent setting with other threads, due to these threads possibly interfering with its execution. Proving non-interference and correctness of concurrent Java programs [Klebanov, 2004, Beckert and Klebanov, 2007] for an arbitrary number of threads is a complex endeavor since every possible (modulo symmetries) intermixing of execution progress of the threads has to be considered. Section 9 elaborates on how the RTSJ memory model can be employed for achieving improved data encapsulation guarantees which eases verification of non-interference. It also proposes proof obligations for
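The interference problem can be made concrete with a standard Java example (hypothetical names, not taken from the thesis): an increment that is provably correct in isolation loses its guarantee once a second thread interleaves between its read and its write.

```java
// A classic data race: two unsynchronized increments may lose an update.
class RaceExample {
    static int counter = 0;

    // Verified in isolation: "ensures counter == \old(counter) + 1".
    static void increment() {
        int tmp = counter;  // read
        counter = tmp + 1;  // write; another thread may have run in between
    }

    public static void main(String[] args) throws InterruptedException {
        Thread a = new Thread(RaceExample::increment);
        Thread b = new Thread(RaceExample::increment);
        a.start(); b.start();
        a.join(); b.join();
        // May print 1 instead of 2: the sequential proof no longer applies.
        System.out.println(counter);
    }
}
```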

Component-based Deductive Verification of Cyber-Physical Systems / submitted by Andreas Müller

Early computers (around the middle of the twentieth century) were primarily used to solve equation systems [76], but application areas soon grew rapidly, including the use of computers as control loops around physical systems [55]. Such systems that interface and interact directly with their real-world surroundings through sensors and exhibit physical behavior through actuators are commonly known as cyber-physical systems (alternative terms are hybrid systems or embedded systems). Today, these cyber-physical systems (CPS) are pervasively embedded into our lives and increasingly act in close proximity to as well as with direct impact on humans. For example, cars equipped with adaptive cruise control form a typical CPS [63], which is responsible for controlling acceleration/braking on the basis of distance sensors: setting the acceleration of a car is the cyber part, while the car and its motion form the physical part. Further prominent examples can be found in many safety-critical areas where lives are at stake, such as factory automation [85], medical equipment [61], aviation [120], automotive [45], railway industries [102], robotics [71], and power plants [116]. Because of their safety-criticality, we have to ensure correctness properties, such as safety, i.e., nothing bad will ever happen [57].

Abstract state machines: verification problems and computational power

It is natural to ask whether the restrictions on the ASMs can be weakened if we aim not at automatic verification but at concepts supporting verification, debugging, and testing. One such possibility is the concept of slicing ASMs, which we introduce in the third part of this work. The idea is analogous to that of program slicing, which aims to extract the statements of a program that are relevant to its behavior at a given point of interest. These statements again form a syntactically correct program, called a slice. Previous work has focused on programming languages that differ substantially from ASMs. Although the concept of program slicing does not directly extend to ASMs, it is possible to find an analogous concept for ASMs. We present such an approach. Despite the fact that a minimal slice is not computable in the general case, we prove that a minimal slice is computable for guarded ASMs. This basic result can be extended in several ways. We present some extensions to larger classes of ASMs and other variants of the notion of slicing.
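For readers unfamiliar with classical program slicing, the following Java fragment (an illustrative textbook-style example, not an ASM) shows the idea: slicing with respect to the value of product at the return statement discards every statement that cannot influence that value, and what remains is again a correct program.

```java
// Slicing criterion: the value of `product` at the return statement.
final class SliceDemo {
    static int product(int n) {
        int sum = 0;               // NOT in the slice
        int product = 1;           // in the slice
        for (int i = 1; i <= n; i++) {
            sum += i;              // NOT in the slice
            product *= i;          // in the slice
        }
        return product;            // point of interest
    }
}
```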

A verification analysis of power quality and energy yield of a large scale PV rooftop

The power quality and energy yield of a large-scale PV rooftop power plant in Samut Songkhram province are analyzed and presented in this paper. The power quality is examined and analyzed from the measured data to verify compliance with the Provincial Electricity Authority (PEA) standard in Thailand. The measured parameters used in this study are as follows: RMS voltage, frequency, total voltage harmonic distortion (THDv), and voltage ripple. Certain parameters of the measured data are used to calculate the distributed power yield, which is then compared with the Homer program simulation. The investigated PV rooftop system has an installed capacity of 987.84 kWp. From the monitoring results, it was found that the highest power yield was 778.125 kW, while the simulation result was 783 kW. Moreover, based on the PEA standard EN 50160 with the cumulative percentile at 95% for PV rooftop power plants, the measured data showed that the power quality of this power plant passed the PEA regulations for its distribution network connecting system.

Symbolic execution and program synthesis : a general methodology for software verification

One of the most commonly used verification techniques is symbolic execution. In fact, this technique is so common that it is known under many different names, and one can argue at length about the nuances telling each notion apart from the others. In this thesis, we will not try to join the debate and will not separate the notions of symbolic execution [Kin76], abstract interpretation [CC77], supercompilation [Tur86], and partial evaluation [LS91; Fut99]. Our understanding of symbolic execution is simply the abstraction from concrete to symbolic values representing sets of concrete values. When executing a program, one hence has to adapt its semantics to work on program states containing such symbolic values. While this verification technique is very powerful and flexible, it is also computationally complex. The crux of a successful application of symbolic execution is the use of the right abstraction depth, i.e., defining the right abstract domain. If this domain is too coarse, symbolic execution loses too much power, such that most programs cannot be verified anymore. If the abstraction is too fine-grained, the computational complexity becomes prohibitive.
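A minimal sketch of such an abstract domain, assuming intervals over int as the symbolic values (illustrative; none of the cited approaches is tied to this particular domain): each symbolic value stands for a set of concrete values, and the program's arithmetic is re-interpreted to over-approximate all concrete executions at once.

```java
// An interval stands for the set of concrete values between lo and hi.
record Interval(long lo, long hi) {
    // Abstract addition: a sound over-approximation of concrete "+".
    Interval plus(Interval other) {
        return new Interval(lo + other.lo, hi + other.hi);
    }
    // Join: least upper bound, applied where control-flow paths merge.
    Interval join(Interval other) {
        return new Interval(Math.min(lo, other.lo), Math.max(hi, other.hi));
    }
}

final class IntervalDemo {
    public static void main(String[] args) {
        Interval x = new Interval(0, 10);                 // x in [0, 10]
        Interval thenBranch = x.plus(new Interval(1, 1)); // x + 1 on one path
        Interval merged = thenBranch.join(x);             // merge both paths
        System.out.println(merged);                       // [0, 11]
    }
}
```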

An improved rule for while loops in deductive program verification

An important advantage of using modifier sets is that usually a loop only changes a few locations, and only these locations must be put in the modifier set. With the traditional rule, on the other hand, all locations that do not change and whose value is of importance have to be included in the invariant, and typically the number of locations that are not changed by the loop is much bigger than the number of locations that are actually changed. Of course, in general not everything that remains unchanged is needed to establish the post-condition in the third premiss. But when applying the invariant rule it is often not obvious what information must be preserved, in particular if the loop is followed by a non-trivial program. That can lead to repeated failed attempts to find the right invariant that allows the proof to be completed. By contrast, to figure out the locations that are possibly changed by the loop, it is usually enough to look at the small piece of code in the loop body.
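A hedged illustration in Java with JML-style annotations (a hypothetical method; the exact loop-specification syntax varies between tools): the modifier set lists only the array slots the loop writes, so the prover knows every other location keeps its value without the invariant having to say so.

```java
class ArrayOps {
    /*@ requires a != null;
      @ ensures (\forall int k; 0 <= k && k < a.length; a[k] == 0);
      @ assignable a[*];
      @*/
    static void clear(int[] a) {
        /*@ loop_invariant 0 <= i && i <= a.length
          @   && (\forall int k; 0 <= k && k < i; a[k] == 0);
          @ assignable a[*];   // modifier set: read off the small loop body
          @ decreases a.length - i;
          @*/
        for (int i = 0; i < a.length; i++) {
            a[i] = 0;
        }
    }
}
```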

Compiler verification in the context of pervasive system verification

In [ORW95], the authors describe a verified compiler for PreScheme, the implementation language for the vlisp run-time system. The compiler as well as the proof were divided into three parts: a front end which translates source text into a core language, a syntax-directed compiler which translates the core language into a combinator-based tree-manipulation language, and a linearizer which translates combinator code into the target language. The work has not been formalized in a theorem prover. However, the authors believe that a simplification-based theorem prover could automatically handle most cases of the syntax-directed translation while the back end proofs may be amenable to mechanization by interactive theorem provers. During tests of a compiler implementation which was manually derived from the specification, the authors found bugs in the assembly code sequences generated for the so-called stored-program machine instructions. These errors were below the grain of the presented proofs (e.g., registers not being saved across routine calls). According to the authors, extending the proof to reach this level – which is obviously required in our scenario – would require an extremely detailed model of the behavior of the machine and operating system.

Program-level Specification and Deductive Verification of Security Properties

The security requirement is expressed by a method contract. The contract states that the attribute m_result as well as its content depends at most on the value of the parameter x (and therefore not on the value of secret). Though the method generates new objects, the method contract does not use the keyword \new_objects. This is possible because the verification problem is split into two parts: the generation of the new array object and the loop. In the first part, as well as in each loop iteration, only one new object is created. This makes it easy to find an isomorphism for the newly created objects of each part. Further, by the compositionality result of Section 6.2, we may assume at the end of each part that identical objects have been created. Because this holds in particular after the execution of the loop, the method contract does not need to list the new objects again, but can check for object identity instead. Intuitively, each part checks for a compatible extension of the isomorphism of the previous part, and therefore we do not need to check for the existence of an isomorphism for the composition of those parts again. This fact simplifies the verification of object-sensitive noninterference considerably.
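A sketch of such a contract in KeY-style JML* information-flow notation (the determines clause): the method body and all names except m_result, x, and secret are hypothetical, and the exact clause syntax may differ between tool versions.

```java
// m_result and its content depend at most on x, not on secret.
class FlowExample {
    Object[] m_result;
    int secret;

    /*@ determines m_result, m_result[*] \by x; @*/
    void compute(int x) {
        m_result = new Object[x];        // first part: a single new object
        for (int i = 0; i < x; i++) {
            m_result[i] = new Object();  // each iteration: one new object
        }
    }
}
```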

Researching Gangs: How to Reach Hard-to-Reach Populations and Negotiate Tricky Issues in the Field

approach instead of a theory-testing study because it bears the power to detect new thoughts, probable causes and underlying mechanisms that have been overlooked so far (CORBIN & STRAUSS, 1990; GLASER & STRAUSS, 1967). [2] "How did you do that?" is usually the first question I hear about my research. The aim of sharing my experiences is to encourage young researchers to re-evaluate the (perceived) limitations of conducting field research themselves in order to gain new and relevant insights into social phenomena while ensuring their own personal safety. Conducting qualitative research with hard-to-reach populations can, depending on the type of population, pose an immense security risk. The dangerous nature of the task is linked to either the high-risk settings or the actors themselves (e.g., criminals, combatants, rebels, terrorists, or members of other violent collectives). Anthropologists, ethnographers, sociologists, and political scientists have all experienced the challenges of conducting qualitative research and the related hazards. The resulting publications have contributed to a growing body of literature, which covers "dangerous fieldwork" (LEE, 1995; NILAN, 2002), "danger in the field" (LEE-TREWEEK & LINKOGLE, 2000a), and "physical dangers to fieldworkers" (BELOUSOV et al., 2007). Research on communities in high-crime areas or war zones entails the threat of physical danger to qualitative researchers; however, it can provide valuable data on the social lives of people in communities which have few points of contact with the outside world. Researchers undertaking such work face physical, emotional, ethical, and professional

REACH und Kunststoffrecycling

Until EU-wide rules on the end-of-waste status for plastics recycling have been implemented, the existing local waste-law provisions apply. In many cases, however, these have not been broken down to the level of individual process steps. If a plastics recycler wants clarity about the proper implementation of the new chemicals-law requirements, it can make sense to clarify directly with the competent waste authority exactly where the waste status ends (e.g., already with the regrind – under REACH a substance or mixture – or only with the downstream production of profile materials and the like – under REACH an article).

Extending the marketing concept

Conclusions. The marketing concept has evolved around the firm’s ability to respond to the needs of its environment, particularly its customers. However, the SMP literature offers a limited view of how marketing facilitates that response. Its focus is upon managing change, but it fails to identify why marketing is qualified to lead such initiatives, aside from the insight it gives into customer needs. A gap exists in the marketing literature about developing marketing resources that will enable the firm to implement new customer-led marketing strategies either more effectively or more efficiently than competitors. These are necessary conditions for the creation of sustained competitive advantage. The authors believe that planning marketing resource development is more suited to marketing’s traditional skills and role within the firm than directing firm-wide change programmes. Better knowledge of how to regenerate these resources will enable marketing to make a stronger contribution to the firm’s ability to respond to its market insight. We identify four initial research areas for marketing: valuing resources, knowledge management, alliance management, and reputation management. If Levitt taught marketing to focus upon customer needs rather than products, Resource-Based Theory can help marketing focus on the profitable exploitation of customer needs rather than customer needs alone.

Reach am Arbeitsplatz

Since the publication of the broad outline of the reform in the 2001 White Paper on the future chemicals policy of the Community, the debate about REACH has given rise to extensive polemics. There is broad consensus on the need for better control of the risks of chemical substances circulating on the European and world markets. This has also been expressed in international decisions, e.g., at the Johannesburg Summit. Nevertheless, some companies, and with them some governments, claim that implementing this new legislation would entail a noticeable increase in the cost of chemical substances in Europe. This, they argue, would lead to a large number of important substances being withdrawn from the Community market and to massive job losses in the affected industries.

Extending the knowledge base of foresight

(Yoon et al., 2008a) or keyword-extraction techniques (Yoon et al., 2008b; Lin et al., 2008; Lee et al., 2008). For example, SAO structures originate from TRIZ (Altshuller, 1984), the Russian acronym for the “theory of inventive problem solving”. SAO is focused on key concepts instead of keywords, where the AO part describes the problem and S stands for the solution. As a data source, patent data is used in most cases (e.g. Choi et al., 2013; Lee et al., 2008), but so are other sources such as product manuals (Yoon et al., 2008a). For example, patent data is used to analyze trends and identify relations between product and technology (Lee et al., 2008). Text mining is used in preparation for the roadmapping process to explore and structure the thematic field (Yoon et al., 2008b; Kostoff et al., 2004) or, for example, to develop a specific technology-monitoring framework (Lin et al., 2008). In comparison, other results are technically far-reaching and the roadmap is constructed semi-automatically (Choi et al., 2013; Suh and Park, 2009). Even in these cases, dedicated expert involvement is still necessary, especially for controlling and selecting the keywords or search terms (Yoon et al., 2008a; Lee et al., 2008; Suh and Park, 2009). Text mining is rarely used in parallel to roadmapping as a simultaneous supporting element (Yoon et al., 2008a), so that the experts assist the data analysis (Lee et al., 2008) and not the other way round, i.e., data analysis supporting expert-based roadmapping.

Transfers and implementations of a Swedish manual work program: sloyd and entrepreneurial power

In other cases, the students themselves published and were reported on. In 1889, New Zealand hosted a World Fair [the New Zealand and South Seas Exhibition] in Otago, which was visited by over half a million visitors over six months. The Fair had four special courts running parallel to the main avenue, one of which was the Education Court. In its first bay, the Kindergarten bay, there was a Sloyd bench for kindergarten carpentry classes. The New Zealand Herald described this «Swedish invention and useful apparatus», which, alongside other educational exhibits, would prove to other colonial visitors that New Zealand education «does not lag behind in the progressive march of nations» (NH, 1889). A sister paper, the Otago Witness, followed its statement that a great deal of interest was taken in the Sloyd system by explaining its ideas and practices at length (OW, 1889). Two years later, another New Zealand paper, the Bush Advocate, referring to a recent edition of the New Zealand Times, discussed the Sloyd system at length and referenced two teachers in London – Nystrom and Chapman. In particular, it focused on their approach: «[They] do not advocate the adoption of the Swedish system but insist on the necessity of having a method of manual training adapted to English habits and English forms and customs.» (BA, 1891, p. 2)

An adaptive deductive planning system

Improving the usability of computer systems is an important research goal of human-computer interaction (cf. [Ben93]). Adaptive systems, with their design variety, help to approach this goal. In the case of a system adapting to a human, user modelling plays an important role (cf. [McT93, Kob93]). The system should be able to adapt to the individual characteristics and needs of its users. Human-computer interaction systems of that kind, where a planner is an integrated part, can be members of the following classes: intelligent help systems (cf. [Bre90], [Tat92], [BBD+93]), intelligent assistant systems (cf. [GJ90], [Boy91], [SC92]), or intelligent tutoring systems (cf. [Nwa91], [EC93]). In these systems, the planner's results have to be measured with respect to the user's ongoing task, his knowledge about the domain including his preferences, and his experience with the application system (cf. [Win90], [Kok91]). A plan is then optimal if it is both well adapted to the user's requirements and as short as possible according to the planner's ability. With the integration of general optimization techniques in a planning system, the complexity problem of planning must be considered (cf. [BN93]). Often, only approximate solutions are realistic, e.g. plan merging (cf. [YNH92]), transformational plan synthesis (cf. [Lin90]), and localization (cf. [Lan90]). It is desirable to place at the planning system engineer's disposal a pool of optimization techniques from which the most promising candidate for the current application can then be taken.

Extending Financialisation and Increasing Fragility of the Financial System

closely linked to centennial changes in the position, role and importance of the macroeconomic sectors of the national economies – primary, secondary (industrial) and tertiary (services) sectors (Clark, 1957; Rostow, 1960; Lapavitsas, 2009a) – with the emphasis nowadays on the services sector, including financial services, in the developed economies. Being part of such a general historical process, the development of financialisation in recent decades has been stimulated by several factors, the most significant being neoliberal decentralisation (since the 1980s, in the financial and commercial sectors), the innovation and utilisation of new financial instruments and mechanisms, the increasing indebtedness of the population (households) and of companies, the rising public debt, the emergence and development of financial institutions and the main financial agents, as well as the demand for financial means (investments and liquidities) (Onaran et al., 2010).