Repeating the Past
Experimental and Empirical Methods
in System and Software Security
Stephan Neuhaus
Dissertation submitted for the degree of Doctor of Engineering (Doktor der Ingenieurswissenschaften)
Faculties of Natural Sciences and Technology, Universität des Saarlandes
Saarbrücken, 2007
© 2007, 2008 by Stephan Neuhaus
All rights reserved. Published 2008
Printed January 24, 2008 from SVN revision 143
Day of Defense February 6, 2008
Dean Prof. Dr. Thorsten Herfet
Head of the Examination Board Prof. Dr. Reinhard Wilhelm
Members of the Examination Board Prof. Dr. Andreas Zeller,
Prof. Dr. Michael Backes, Philipp Lucas
Main Typeface URW Garamond No. 8 10/12
Additional Typefaces Adobe Times, Adobe Courier,
Computer Modern Math Symbols
Typesetting System PDF-LaTeX
Printed in Germany
To Stefanie, with love.

It’s like déjà vu all over again.
— Yogi Berra

Contents
Abstract xi
Preface xv
1 Introduction 1
2 Causes And Effects 3
2.1 Analysis of Break-Ins . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2.1.1 Forensic Analysis of Break-ins . . . . . . . . . . . . . . . . . 3
2.1.2 Cause-Finding as Minimization . . . . . . . . . . . . . . . . 5
2.1.3 Philosophical Justification . . . . . . . . . . . . . . . . . . . 6
2.2 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.2.1 Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.2.2 Research in Intrusion Analysis . . . . . . . . . . . . . . . . . 11
2.2.3 Research in Intrusion Prevention . . . . . . . . . . . . . . . 12
2.2.4 Verification and Formal Methods . . . . . . . . . . . . . . . 13
3 Minimization: State of the Art 15
3.1 Preliminary Definitions . . . . . . . . . . . . . . . . . . . . . . . . 15
3.2 The Minimization Problem . . . . . . . . . . . . . . . . . . . . . . 17
3.3 Naïve Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
3.4 Delta Debugging . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
4 Minimization For Intrusion Analysis 27
4.1 Extended Naïve Algorithm . . . . . . . . . . . . . . . . . . . . . . . 27
4.2 Extreme Minimization . . . . . . . . . . . . . . . . . . . . . . . . . 30
4.3 Binary Minimization . . . . . . . . . . . . . . . . . . . . . . . . 31
5 Malfor 33
5.1 Processes and Replay . . . . . . . . . . . . . . . . . . . . . . . . . . 33
5.1.1 Preliminary Definitions . . . . . . . . . . . . . . . . . . . . 33
5.1.2 Replay . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
5.2 System Call Interposition . . . . . . . . . . . . . . . . . . . . . . . 36
5.2.1 System Calls . . . . . . . . . . . . . . . . . . . . . . . . . . 36
5.2.2 Interposition . . . . . . . . . . . . . . . . . . . . . . . . . . 38
5.2.3 Solipsy Architecture . . . . . . . . . . . . . . . . . . . . . . 38
5.2.4 State Mappings . . . . . . . . . . . . . . . . . . . . . . . . . 39
5.3 Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
5.3.1 Proof of Concept . . . . . . . . . . . . . . . . . . . . . . . . 40
5.3.2 A Real-World Attack . . . . . . . . . . . . . . . . . . . . . . 43
5.3.3 Replacing Delta Debugging by Binary Minimization . . . . . 48
5.3.4 Malfor Performance . . . . . . . . . . . . . . . . . . . . . . 48
5.3.5 Performance of Capturing . . . . . . . . . . . . . . . . . . . 49
5.3.6 Deployability . . . . . . . . . . . . . . . . . . . . . . . . . . 49
5.3.7 Fooling Malfor . . . . . . . . . . . . . . . . . . . . . . . . . 50
5.4 Visualizing Malfor . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
5.5 Beyond Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
5.5.1 Input Reassembly . . . . . . . . . . . . . . . . . . . . . . . . 52
5.5.2 Signature Extraction . . . . . . . . . . . . . . . . . . . . . . 54
5.5.3 Signature Generalization . . . . . . . . . . . . . . . . . . . . 56
5.5.4 Preliminary Results . . . . . . . . . . . . . . . . . . . . . . 56
5.6 Controlling Concurrency . . . . . . . . . . . . . . . . . . . . . . . 59
5.6.1 System Calls and Traces . . . . . . . . . . . . . . . . . . . . 60
5.6.2 Capturing Traces . . . . . . . . . . . . . . . . . . . . . . . . 61
5.6.3 Plausible Traces . . . . . . . . . . . . . . . . . . . . . . . . . 62
5.6.4 Causality in Distributed Systems . . . . . . . . . . . . . . . 63
5.6.5 Causality-Preserving Replay . . . . . . . . . . . . . . . . . . 67
5.7 Lessons Learned . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
5.7.1 State Changes . . . . . . . . . . . . . . . . . . . . . . . . . . 68
5.7.2 State Mappings . . . . . . . . . . . . . . . . . . . . . . . . . 69
5.7.3 Why Is Replay So Hard To get Right? . . . . . . . . . . . . . 69
5.7.4 Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
6 Forecasting Vulnerable Components 75
6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
6.2 Components and Vulnerabilities . . . . . . . . . . . . . . . . . . . . 77
6.2.1 Components . . . . . . . . . . . . . . . . . . . . . . . . . . 77
6.2.2 Mapping Vulnerabilities to Components . . . . . . . . . . . 78
6.2.3 Vulnerable Components in Mozilla . . . . . . . . . . . . . . 78
6.3 How Imports and Function Calls Matter . . . . . . . . . . . . . . . 80
6.3.1 Imports . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
6.3.2 Function Calls . . . . . . . . . . . . . . . . . . . . . . . . . 82
6.3.3 Mapping Vulnerabilities to Features . . . . . . . . . . . . . . 83
6.3.4 Features in Mozilla . . . . . . . . . . . . . . . . . . . . . . . 84
6.4 Predicting Vulnerabilities From Features . . . . . . . . . . . . . . . 85
6.4.1 Validation Setup . . . . . . . . . . . . . . . . . . . . . . . . 85
6.4.2 Evaluating Classification . . . . . . . . . . . . . . . . . . . . 86
6.4.3 Evaluating Ranking . . . . . . . . . . . . . . . . . . . . . . 87
6.4.4 Prediction using Support Vector Machines . . . . . . . . . . 89
6.5 Case Study: Mozilla . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
6.5.1 Data Collection . . . . . . . . . . . . . . . . . . . . . . . . . 90
6.5.2 Classification . . . . . . . . . . . . . . . . . . . . . . . . . . 91
6.5.3 Ranking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
6.5.4 A Ranking Example . . . . . . . . . . . . . . . . . . . . . . 93
6.5.5 Forecasting Vulnerable Components . . . . . . . . . . . . . 93
6.5.6 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
6.5.7 Threats to Validity . . . . . . . . . . . . . . . . . . . . . . . 95
6.6 Causes of Vulnerabilities . . . . . . . . . . . . . . . . . . . . . . . . 96
6.6.1 Clusters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
6.6.2 Singletons . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
6.7 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
6.8 Beyond Imports and SVMs . . . . . . . . . . . . . . . . . . . . . . . 101
6.8.1 Learning Patterns . . . . . . . . . . . . . . . . . . . . . . . . 101
6.8.2 Other Future Work . . . . . . . . . . . . . . . . . . . . . . 103
7 Conclusions 107
A Philosophy 111
A.1 Rationalist Philosophy . . . . . . . . . . . . . . . . . . . . . . . . . 111
A.2 Empiricist Philosophy . . . . . . . . . . . . . . . . . . . . . . . 113
A.3 Modern Philosophy of Science . . . . . . . . . . . . . . . . . . . . . 113
Notation 115
Glossary 117
Bibliography 119
Index 133

Abstract
I propose a new method of analyzing intrusions: instead of analyzing evidence and deducing what must have happened, I find the intrusion-causing circumstances by a series of automatic experiments. I first capture the processes' system calls, and when an intrusion has been detected, I use these system calls to replay some of the captured processes in order to find the intrusion-causing processes—the cause-effect chain that led to the intrusion. I extend this approach to also find the inputs to those processes that cause the intrusion—the attack signature.
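The thesis's own capture layer is the Solipsy system-call interposition described in Chapter 5. As a rough stand-in for the capture step, the following Python sketch records a toy command's system calls with strace and picks out the process-creating calls that a replay stage would have to reproduce; the command, log file name, and matching pattern are illustrative assumptions, not the thesis's tooling.

    import re
    import subprocess

    # Record the system calls of a toy command tree with strace:
    # -f follows child processes, -tt adds timestamps, -o writes a log.
    # (Illustrative stand-in for the capture tool used in the thesis.)
    LOG = "syscalls.log"  # hypothetical log file name
    subprocess.run(
        ["strace", "-f", "-tt", "-o", LOG, "sh", "-c", "ls | wc -l"],
        check=True,
    )

    # With -f and -o, each log line starts with the PID of the caller.
    # Keep only the calls that create or transform processes; these are
    # the events a replay stage must reproduce.
    spawn = re.compile(r"^(\d+)\s+[\d:.]+\s+(clone|fork|vfork|execve)\(")
    with open(LOG) as fh:
        for line in fh:
            m = spawn.match(line)
            if m:
                print(f"pid {m.group(1)} issued {m.group(2)}")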
Intrusion analysis is a minimization problem—how to find a minimal set of circumstances that makes the intrusion happen. I develop several efficient minimization algorithms, establish their theoretical properties, such as worst-case running times, and compare their average running times empirically.
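Chapter 3 reviews delta debugging as the state-of-the-art minimization algorithm that the thesis builds on. For the flavor of it, here is a minimal Python sketch of the classic ddmin procedure, run against a toy test in which the intrusion reproduces exactly when processes 2, 5, and 9 are all replayed; the process numbers and the test function are invented for illustration.

    def ddmin(circumstances, failing):
        """Classic delta debugging (ddmin): shrink `circumstances` to a
        1-minimal sublist for which `failing` still returns True.
        Assumes distinct elements, as process IDs are."""
        n = 2
        while len(circumstances) >= 2:
            size = -(-len(circumstances) // n)  # ceiling division
            subsets = [circumstances[i:i + size]
                       for i in range(0, len(circumstances), size)]
            reduced = False
            for s in subsets:                   # try to reduce to a subset
                if failing(s):
                    circumstances, n, reduced = s, 2, True
                    break
            if not reduced:
                for s in subsets:               # try to reduce to a complement
                    rest = [c for c in circumstances if c not in s]
                    if failing(rest):
                        circumstances, reduced = rest, True
                        n = max(n - 1, 2)
                        break
            if not reduced:
                if n >= len(circumstances):     # already at granularity 1
                    break
                n = min(2 * n, len(circumstances))  # refine and retry
        return circumstances

    # Toy replay test: the intrusion reproduces iff processes 2, 5, 9 all run.
    culprits = {2, 5, 9}
    print(ddmin(list(range(32)), lambda procs: culprits <= set(procs)))
    # -> [2, 5, 9]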
My evaluations show that the approach is correct and practical: it finds the 3 processes out of 32 that are responsible for a proof-of-concept attack in about 5 minutes, and it finds the 72 out of 168 processes in a large, complicated, and difficult-to-detect multi-stage attack involving Apache and suidperl in about 2.5 hours. I also extract attack signatures for proof-of-concept attacks in reasonable time.
I have also considered the problem of predicting, before deployment, which components of a software system are most likely to contain vulnerabilities. I present empirical evidence that vulnerabilities are connected to a component's imports. In a case study on Mozilla, I correctly predicted half of all vulnerable components, and more than two thirds of my predictions were correct.
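Chapter 6 represents each component by the names it imports and trains a support vector machine on components with known vulnerability history. The sketch below reproduces that idea with scikit-learn on invented data; the header names, labels, and binary feature encoding are illustrative assumptions, and the study's actual feature extraction and SVM setup are described in Sections 6.3 and 6.4.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.svm import SVC

    # Each component is represented by the names it imports (invented
    # examples); the label says whether it had a vulnerability-related fix.
    train_imports = [
        "nsIContent.h nsDOMError.h nsCOMPtr.h",
        "gfxContext.h nsRenderingContext.h",
        "nsIContent.h nsContentUtils.h",
        "prio.h plstr.h",
    ]
    train_labels = [1, 0, 1, 0]  # 1 = vulnerable in the past, 0 = not

    # Binary features: does the component import a given name?
    vectorizer = CountVectorizer(binary=True, token_pattern=r"\S+")
    X = vectorizer.fit_transform(train_imports)

    # A linear support vector machine, as used for classification in Chapter 6.
    clf = SVC(kernel="linear")
    clf.fit(X, train_labels)

    # Rank a new component by its distance from the separating hyperplane.
    new_component = vectorizer.transform(["nsIContent.h gfxContext.h"])
    print(clf.decision_function(new_component))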
Zusammenfassung

I present a new method of intrusion analysis: instead of analyzing traces and inferring the course of events from them, I find the intrusion-causing circumstances through automatic experiments. First, I record the processes' system calls.