Computer Security in the Real World
Butler Lampson, Microsoft

Security: The Goal
- Computers are as secure as real-world systems, and people believe it.
- This is hard because:
  - Computers can do a lot of damage fast.
  - There are many places for things to go wrong.
  - Networks enable:
    - anonymous attacks from anywhere
    - automated infection
    - hostile code and hostile hosts
  - People don't trust new things.

Real-World Security
- It's about value, locks, and police.
  - Locks good enough that bad guys don't break in very often.
  - Police and courts good enough that bad guys who do break in get caught and punished often enough.
  - Less interference with daily life than the value of the loss.
- Security is expensive; buy only what you need.

Elements of Security
- Policy: specifying security. What is it supposed to do?
- Mechanism: implementing security. How does it do it?
- Assurance: correctness of security. Does it really work?

Dangers
- Vandalism or sabotage that
  - damages information (integrity)
  - disrupts service (availability)
- Theft of money (integrity)
- Theft of information (secrecy)
- Loss of privacy (secrecy)

Vulnerabilities
- Bad (buggy or hostile) programs
- Bad (careless or hostile) people giving instructions to good programs
- Bad guy interfering with communications

Defensive strategies
- Keep everybody out: isolation
- Keep the bad guy out: code signing, firewalls
- Let him in, but keep him from doing damage: sandboxing, access control
- Catch him and prosecute him: auditing, police

The Access Control Model
- Guards control access to valued resources.
- [Figure: a principal (the source of a request) asks a guard (reference monitor) to "do operation" on a resource (object); the guard decides whether the request goes through.]

Mechanisms: The Gold Standard
- Authenticating principals
  - Mainly people, but also channels, servers, programs
- Authorizing access
  - Usually for groups of principals
- Auditing
- Assurance
  - Trusted computing base

Assurance: Making Security Work
- Trusted computing base (TCB)
  - Limit what has to work to ensure security
  - Ideally, the TCB is small and simple
  - Includes hardware and software
  - Also includes configuration, usually overlooked:
    - what software has privileges
    - database of users, passwords, privileges, groups
    - network information (trusted hosts, ...)
    - access controls on system resources
    - . . .
- "The unavoidable price of reliability is simplicity." (Hoare)

Assurance: Configuration
- Users keep it simple
  - At most three levels: self, friends, others
  - Three places to put objects
  - Everything else done automatically with policies
- Administrators keep it simple
  - Work by defining policies. Examples:
    - Each user has a private home folder
    - Each user belongs to one workgroup with a private folder
    - System folders contain vendor-approved releases
    - All executable programs are signed by a trusted party
- Today's systems don't support this very well

Assurance: Defense in Depth
- Network, with a firewall
- Operating system, with sandboxing
  - Basic OS (such as NT)
  - Higher-level OS (such as Java)
- Application that checks authorization directly
- All need authentication

Why We Don't Have "Real" Security
- A. People don't buy it:
  - Danger is small, so it's OK to buy features instead.
  - Security is expensive.
    - Configuring security is a lot of work.
    - Secure systems do less because they're older.
  - Security is a pain.
    - It stops you from doing things.
    - Users have to authenticate themselves.
- B. Systems are complicated, so they have bugs.
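Before turning to how operating systems and distributed systems implement this, here is a minimal sketch of the access control model and the gold standard above: a guard (reference monitor) that authorizes a request against an ACL and audits the decision. The Guard and Resource classes, the principal names, and the permission strings are invented for this illustration, and authentication is assumed to have already happened (the requester is taken to arrive on an authenticated channel).

```python
# Minimal sketch of the access-control model: a guard (reference monitor)
# authorizes a request against the resource's ACL and audits the decision.
# All names here (Guard, Resource, "Spectra", "Alice@Intel") are illustrative.

from dataclasses import dataclass, field


@dataclass(frozen=True)
class Principal:
    name: str                                   # e.g. "Alice@Intel" or a key fingerprint


@dataclass
class Resource:
    name: str
    acl: dict = field(default_factory=dict)     # principal name -> set of permissions


class Guard:
    """Reference monitor: every request for the resource passes through here."""

    def __init__(self):
        self.audit_log = []                     # auditing: record every decision

    def check(self, requester: Principal, operation: str, resource: Resource) -> bool:
        # Authorization: does the ACL grant this principal the operation?
        allowed = operation in resource.acl.get(requester.name, set())
        # Auditing: who asked for what, and what was decided.
        self.audit_log.append((requester.name, operation, resource.name, allowed))
        return allowed


# Usage: Spectra's ACL grants Alice@Intel read and write.
spectra = Resource("Spectra", acl={"Alice@Intel": {"read", "write"}})
guard = Guard()
print(guard.check(Principal("Alice@Intel"), "read", spectra))    # True
print(guard.check(Principal("Bob@Microsoft"), "read", spectra))  # False
print(guard.audit_log)
```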
Standard Operating System Security
- Assume a secure channel from the user (without proof)
- Authenticate the user by local password
- Assign local user and group SIDs
- Access control by ACLs: lists of SIDs and permissions
  - Reference monitor is the OS, or any RPC target
- Domains: same, but authenticate by RPC to the domain controller
- Web servers: same, but simplified
  - Establish a secure channel with SSL
  - Authenticate the user by local password (or certificate)
  - ACL on the right to enter, or on the user's private state

End-to-End Security
- Authenticate secure channels
- Work uniformly between organizations
  - Microsoft can securely accept Intel's authentication
  - Groups can have members from different organizations
- Delegate authority to groups or systems
- Audit all security decisions

End-to-End example
- Alice is at Intel, working on Atom, a joint Intel-Microsoft project
- Alice connects to Spectra, Atom's web page, with SSL
- Chain of responsibility:
  K_SSL ⇒ K_temp ⇒ K_Alice ⇒ Alice@Intel ⇒ Atom@Microsoft ⇒ r/w Spectra
- [Figure: Alice's smart card (K_Alice) and her login system (K_temp) are at Intel; the SSL channel (K_SSL) reaches the Spectra web page, whose ACL grants r/w to Atom@Microsoft.]

Principals
- Authentication: Who sent a message?
- Authorization: Who is trusted?
- Principal: the abstraction of "who"
  - People: Alice, Bob
  - Services: microsoft.com, Exchange
  - Groups: UW-CS, MS-Employees
  - Secure channels: key #678532E89A7692F, console
- Principals say things:
  - "Read file foo"
  - "Alice's key is #678532E89A7692F"

Speaks For
- Principal A speaks for B: A ⇒_T B
  - Meaning: if A says something in the set T, B says it too.
  - Thus A is stronger than B, or responsible for B, about T.
- Examples
  - Alice ⇒ Atom (a group of people)
  - Server-1 ⇒ Spectra (a group of servers)
  - Key #7438 ⇒ Alice (a key for Alice)
- Delegation rule: if A says "B ⇒ A", then B ⇒ A
  - We trust A to delegate its own authority.
  - Why should A delegate to B? Needs case-by-case analysis.
- Need a secure channel from A for "A says"
  - Easy if A is a key.
  - The channel can be off-line (a certificate) or on-line (Kerberos).

Authenticating Channels
- Chain of responsibility: K_SSL ⇒ K_temp ⇒ K_Alice ⇒ Alice@Intel ⇒ ...
  - K_temp says "K_SSL ⇒ K_temp" (SSL setup)
  - K_Alice says "K_temp ⇒ K_Alice" (via smart card)

Authenticating Names: SDSI/SPKI
- A name is in some name space, defined by a key
- The key speaks for any name in its name space
  - K_Intel ⇒ K_Intel / Alice (which is just Alice@Intel)
- Chain: ... ⇒ K_temp ⇒ K_Alice ⇒ Alice@Intel ⇒ ...
  - K_Intel says "K_Alice ⇒ Alice@Intel"

Authenticating Groups
- A group is a principal; its members speak for it
  - Alice@Intel ⇒ Atom@Microsoft
  - Bob@Microsoft ⇒ Atom@Microsoft
  - ...
- Evidence for groups: just like names and keys.
- Chain: ... ⇒ K_Alice ⇒ Alice@Intel ⇒ Atom@Microsoft ⇒ r/w ...
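The chain of responsibility in these slides is just a path through a set of "speaks for" facts, each justified by some piece of evidence (a certificate or a channel-setup statement). The sketch below walks such a chain for the Spectra example; the fact list, the justification strings, and the find_chain helper are stand-ins invented for this illustration, and it ignores the restriction set T and the permissions that a real checker would also track.

```python
# Sketch of checking a "speaks for" chain of responsibility, as in the
# End-to-End example: K_SSL => K_temp => K_Alice => Alice@Intel => Atom@Microsoft.
# The facts below stand in for signed certificates or channel-setup statements.

from collections import defaultdict

# Each fact "A speaks for B" is paired with its justification (what an audit records).
speaks_for = [
    ("K_SSL",       "K_temp",         "SSL channel setup: K_temp says K_SSL => K_temp"),
    ("K_temp",      "K_Alice",        "login via smart card: K_Alice says K_temp => K_Alice"),
    ("K_Alice",     "Alice@Intel",    "name certificate: K_Intel says K_Alice => Alice@Intel"),
    ("Alice@Intel", "Atom@Microsoft", "group membership certificate for Atom@Microsoft"),
]

edges = defaultdict(list)
for a, b, why in speaks_for:
    edges[a].append((b, why))


def find_chain(start, goal, seen=frozenset()):
    """Return the justifications proving `start` speaks for `goal`, or None."""
    if start == goal:
        return []
    for b, why in edges[start]:
        if b not in seen:
            rest = find_chain(b, goal, seen | {start})
            if rest is not None:
                return [why] + rest
    return None


# The guard on Spectra asks: does the channel key speak for a principal on the ACL?
chain = find_chain("K_SSL", "Atom@Microsoft")
print("access granted" if chain is not None else "access denied")
for step in chain or []:
    print("  justified by:", step)
```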
Authorization with ACLs
- View a resource object O as a principal
- An ACL entry for P means P can speak for O
  - Permissions limit the set of things P can say for O
- If Spectra's ACL says Atom can r/w, that means
  - Spectra says "Atom@Microsoft ⇒ r/w Spectra"
  - Chain: ... ⇒ Alice@Intel ⇒ Atom@Microsoft ⇒ r/w Spectra

End-to-End Example: Summary
- Request on the SSL channel: K_SSL says "read Spectra"
- Chain of responsibility:
  K_SSL ⇒ K_temp ⇒ K_Alice ⇒ Alice@Intel ⇒ Atom@Microsoft ⇒ r/w Spectra

Compatibility with Local OS?
- (1) Put network principals on OS ACLs
- (2) Let a network principal speak for a local one
  - Alice@Intel ⇒ Alice@microsoft
  - Use network authentication, replacing local or domain authentication
  - Users and ACLs stay the same
- (3) Assign SIDs to network principals
  - Do this automatically
  - Use network authentication as before

Authenticating Systems
- A digest X can authenticate a program Word:
  - K_Microsoft says "if image I has digest X then I is Word"
  - formally, X ⇒ K_Microsoft / Word
- A system N can speak for another system Word:
  - K_Microsoft says N ⇒ K_Microsoft / Word
- The first certificate makes N want to run I if N likes Word, and it lets N assert that the running I is Word.
- The second certificate lets N convince others that N is authorized to run Word.

Auditing
- Checking access: given
  - a request: K_Alice says "read Spectra"
  - an ACL: Atom may r/w Spectra
- check that
  - K_Alice speaks for Atom: K_Alice ⇒ Atom
  - the rights suffice: r/w ≥ read
- Auditing: each step is justified by
  - a signed statement (certificate), or
  - a delegation rule

Implement: Tools and Assurance
- Gold standard
  - Authentication: Who said it?
  - Authorization: Who is trusted?
  - Auditing: What happened?
- End-to-end authorization
  - Principals: keys, names, groups
  - Speaks for: delegation, chain of responsibility
- Assurance: trusted computing base
  - Keep it small and simple.
  - Include configuration, not just code.

References
- Why "real" security is hard
  - Ross Anderson: www.cl.cam.ac.uk/users/rja14
  - Bruce Schneier, Secrets and Lies
- Distributed system security
  - Lampson et al., TOCS 10, 4 (Nov. 1992)
  - Wobber et al., TOCS 12, 1 (Feb. 1994)
- Simple Distributed Security Infrastructure (SDSI): theory.lcs.mit.edu/~cis/sdsi.html
- Simple Public Key Infrastructure (SPKI): www.cis.ohio-state.edu/htbin/rfc/rfc2693.html
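As a closing illustration of the "Authenticating Systems" slide above, the sketch below shows the first kind of certificate in code: a host that trusts K_Microsoft to name programs treats an image as Word exactly when the image's digest matches the certified digest X. The choice of SHA-256, the certificate dictionary, and the image bytes are assumptions made for this example; the slide does not prescribe a hash algorithm or a certificate format.

```python
# Sketch for "Authenticating Systems": a digest X speaks for the program name,
# so an image is accepted as "Word" when its digest matches the certified one.
# The certificate structure and the image bytes here are hypothetical.

import hashlib


def digest_of(image_bytes: bytes) -> str:
    # Assumption: SHA-256 as the digest; the slides do not name a hash function.
    return hashlib.sha256(image_bytes).hexdigest()


def image_is(image_bytes: bytes, cert: dict) -> bool:
    """The host trusts cert['issuer'] for this program name; check the digest."""
    return digest_of(image_bytes) == cert["digest"]


# Stand-in for: K_Microsoft says "if image I has digest X then I is Word".
image = b"... bytes of the executable image ..."
word_certificate = {
    "issuer": "K_Microsoft",
    "program": "Word",
    "digest": digest_of(image),      # the X that K_Microsoft would certify
}

print(image_is(image, word_certificate))               # True: run it as Word
print(image_is(b"tampered image", word_certificate))   # False: unknown program
```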