Interesting (?) resources on software economics.

See AboutThesePages.


Security Related

From the paper under discussion: "According to one common view, information security comes down to technical measures. Given better access control policy models, formal proofs of cryptographic protocols, approved firewalls, better ways of detecting intrusions and malicious code, and better tools for system evaluation and assurance, the problems can be solved. In this note, I put forward a contrary view: information insecurity is at least as much due to perverse incentives. Many of the problems can be explained more clearly and convincingly using the language of microeconomics: network externalities, asymmetric information, moral hazard, adverse selection, liability dumping and the tragedy of the commons."

Interesting paper -- worth reading (I would like to get Alex to read it). Incidental to its main points, the article gives some insight into what business (and surviving in the world) is really about (am I being cynical?). I should have known this already (and may have), but it helps make clear that business is not about providing the greatest good for the greatest number; it is more about competition, even at the expense of efficiency. (Capitalism (and democracy) may be, as is often said, "not perfect, but the best we've got" -- or, as I'd prefer, "the best we've developed so far" -- but it does make one long for something better.)

(Aside: One of my "theories" (or fears, or misconceptions) for a long time has been, though never expressed in these words before, that if we drive all the inefficiencies out of the system, there may not be enough jobs to go around. Then what do we do?)

Reading the paper also gave me some other interesting insights. One relates to the idea that educated people in any field (even economics -- I assume the author is an economist) can come up with ideas and theories that apply far beyond their own narrow field of study. Because those theories are couched in the vocabulary of their specialty, they can be difficult (and, due to NIH, undesirable) for people in other specialties to recognize and accept; instead, those other fields tend to rediscover the same theories independently. (I really should reread the article and then rewrite these notes.)

(I am writing this the day after reading the article, and I have already forgotten some of the insights it gave me -- I would have to read it again, and should.)

Also, it looks like there may be some interesting articles in the bibliography (References).

I was going to copy and paste some key passages here, including the conclusion and, separately, the main thesis of the article (which is not covered in the conclusion), but since the paper is a PDF it is not easy to cut and paste from. The "main thesis": "In general, where the party who is in a position to protect a system is not the party who would suffer the results of a security failure, then problems may be expected."
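
To make that thesis concrete, here is a toy numeric sketch (my own illustration, not from the paper; all numbers and names are made up): one party chooses how much to spend on protection, and the expected loss from a breach falls either on that same party or on someone else. When the protector does not bear the loss, spending nothing is its cheapest choice, even though that is worse for everyone overall.

<verbatim>
# Toy illustration (not from the paper): misaligned incentives in security spending.
# All numbers below are hypothetical, chosen only to show the effect.

BREACH_LOSS = 100_000                     # damage if a breach occurs
PROTECTION_OPTIONS = [0, 5_000, 20_000]   # candidate amounts to spend on protection

def breach_probability(spend):
    """Hypothetical mapping: more spending means a lower chance of a breach."""
    return {0: 0.30, 5_000: 0.10, 20_000: 0.02}[spend]

def cost_seen_by_protector(spend, protector_bears_loss):
    """Expected cost as seen by the party deciding how much protection to buy."""
    expected_loss = breach_probability(spend) * BREACH_LOSS
    return spend + (expected_loss if protector_bears_loss else 0)

for bears_loss in (True, False):
    best = min(PROTECTION_OPTIONS, key=lambda s: cost_seen_by_protector(s, bears_loss))
    print(f"protector bears the loss: {bears_loss} -> spends {best}")

# When the protector bears the loss it spends 5,000 (total expected cost 15,000).
# When someone else suffers the loss it spends 0, even though the overall
# expected cost to everyone is then 30,000 -- the misaligned-incentive problem.
</verbatim>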

IMHO, the paper is definitely worth reading (and rereading), and some of its points -- like the main-thesis quote above -- are worth starting to evangelize.


Contributors

  • RandyKramer - 23 Apr 2002
  • <If you edit this page, add your name here, move this to the next line>

