411 Matching Annotations
  1. Feb 2014
    1. THE ELEMENTS OF BRIEFING Procedural History Legal Issue Facts of Case Statement of Rule Policy Dicta Reasoning Holding Concurrence Dissents

      The Elements of Briefing

      • Procedural History
      • Legal Issue
      • Facts of Case
      • Statement of Rule
      • Policy
      • Dicta
      • Reasoning
      • Holding
      • Concurrence
      • Dissents
    1. Reasoning The reasoning gives the reader insight into how the court arrived at its decision. It is instructive in nature. Courts often back their holdings with several lines of reasoning, each of which should be summarized in this section. Unnecessary repetition of facts or the issue should be avoided. A court's rationale for its holding might be a simple explanation of its thought process. Alternatively, the reasoning might be based on the plain language of the statute, Congressional intent, the re-enactment doctrine, or other common means of resolving judicial disputes.

      Several lines of reasoning may be used to back the Court's holdings; these may include:

      • a simple explanation of the Court's thought processes
      • based on the plain language of the statute
      • congressional intent
      • re-enactment doctrine
      • other common means of resolving judicial disputes (what are those?)
    2. Holding As the issue's complement, the holding consists of two parts: (1) a “yes” or “no” conclusion to the brief's issue and (2) the rule of law the court establishes. The rule of law is a guidepost that courts use to decide future cases based on the legal concept of stare decisis (judicial tendency to follow prior decisions).

      The holding has two parts:

      1) A decision on the legal issue (yes/no)

      2) The rule of law the court establishes

    3. Beginning the issue with “are” or “is” often leads to a clearer and more concise expression of the issue than beginning it with “may,” “can,” “does,” or “should.” The latter beginnings may lead to vague or ambiguous versions of the issue. Examine the following alternative statements of the judicial issue from Aiken Industries, Inc. (TC, 1971), acq.: Issue 2 (Poor): Are the interest payments exempt from the withholding tax? Issue 2 (Poor): Should the taxpayer exempt the interest payments from withholding tax? In the first version of issue 2 above, to which interest payments and which withholding tax is the writer referring? The issue does not stand alone since it cannot be precisely understood apart from separately reading the brief's facts. The extreme brevity leads to ambiguity. In the second version, the question can be interpreted as a moral or judgment issue rather than a legal one. Whether the taxpayer should do (or should not do) something may be a very different issue than the legal question of what the law requires. A legal brief, however, should focus on the latter. Rewriting issue 2 as follows leads to a clearer expression of the precise issue: Issue 2 (Better): Are interest payments exempt from the U.S. 30% withholding tax when paid to an entity established in a tax treaty country for no apparent purpose other than to escape taxation on the interest received?

      Extreme brevity leads to ambiguity. The summary of the issue should be written to avoid opening the question to interpretation as a moral or judgment issue; instead focus on the legal question.

    4. Issues should be stated so that they “stand alone.” That is, issues should be completely understandable without reference to the facts or other sections of the brief or judicial decision. Use of the definite article “the” indicates that the issue does not stand alone when it alludes to prior information.

      The summary of the issue should "stand alone" or be self-contained such that enough context and background is included in the summary to not have to refer to the document it came from.

      I think this is an important pattern to use elsewhere, as well.

    1. Procedural issue: What is the appealing party claiming the lower court did wrong (e.g., ruling on evidence, jury instructions, granting of summary judgment, etc.)?

      Procedural issue. What is the appealing party claiming the lower court did wrong:

      • ruling on evidence
      • jury instructions
      • granting of summary judgment
    2. Substantive issue: A substantive statement of the issue consists of two parts -- i. the point of law in dispute ii. the key facts of the case relating to that point of law in dispute (legally relevant facts) You must include the key facts from the case so that the issue is specific to that case. Typically, the disputed issue involves how the court applied some element of the pertinent rule to the facts of the specific case. Resolving the issue will determine the court’s disposition of the case.
      • the point of law in dispute
      • the key facts of the case relating to that point of law in dispute (legally relevant facts)
    3. b. Identify legally relevant facts, that is, those facts that tend to prove or disprove an issue before the court. The relevant facts tell what happened before the parties entered the judicial system. c. Identify procedurally significant facts. You should set out (1) the cause of action (C/A) (the law the plaintiff claimed was broken), (2) relief the plaintiff requested, (3) defenses, if any, the defendant raised.
    4. Identify the relationship/status of the parties (Note: Do not merely refer to the parties as the plaintiff/defendant or appellant/appellee; be sure to also include more descriptive generic terms to identify the relationship/status at issue, e.g., buyer/seller, employer/employee, landlord/tenant, etc.)

      Identify the factual relationship of the parties, not just the procedural relationship.

      Examples of procedural:

      • plaintiff/defendant
      • appellant/appellee

      Examples of factual:

      • buyer/seller
      • employer/employee
      • landlord/tenant
    5. Functions of case briefing A. Case briefing helps you acquire the skills of case analysis and legal reasoning. Briefing a case helps you understand it. B. Case briefing aids your memory. Briefs help you remember the cases you read (1) for class discussion, (2) for end-of-semester review for final examinations, and (3) for writing and analyzing legal problems.

      Briefing a case helps you understand it and acquire skills of:

      • case analysis
      • legal reasoning

      Case briefing is good for:

      • aids memory
      • class discussion
      • end-of-semester review for final exams
      • writing and analyzing legal problems
    6. Distinctions A. A case brief is a dissection of a judicial opinion -- it contains a written summary of the basic components of that decision. B. Persuasive briefs (trial and appellate) are the formal documents a lawyer files with a court in support of his or her client’s position

      Distinctions

    1. A CAUTIONARY NOTE Don’t brief the case until you have read it through at least once. Don’t think that because you have found the judge’s best purple prose you have necessarily extracted the essence of the decision. Look for unarticulated premises, logical fallacies, manipulation of the factual record, or distortions of precedent. Then ask, How does this case relate to other cases in the same general area of law? What does it show about judicial policymaking? Does the result violate your sense of justice or fairness? How might it have been better decided?

      Read the case to identify:

      • unarticulated premises
      • logical fallacies
      • manipulation of the factual record
      • distortions of precedent.

      Then ask:

      • How does this case relate to other cases in the same general area of law?

      • What does it show about judicial policymaking?

      • Does the result violate your sense of justice or fairness?

      • How might it have been better decided?

    1. This protection is subject to an important limitation. The mere fact that a work is copyrighted does not mean that every element of the work may be protected. Originality remains the sine qua non of copyright; accordingly, copyright protection may extend only to those components of a work that are original to the author. Patterson & Joyce 800-802; Ginsburg, Creation and Commercial Value: Copyright Protection of Works of Information, 90 Colum. L. Rev. 1865, 1868, and n. 12 (1990) (hereinafter Ginsburg).

      That a work is copyrightable does not mean that every element of the work is protectable.

    2. Factual compilations, on the other hand, may possess the requisite originality.

      Factual compilations may possess the requisite originality and so may be copyrightable.

    3. Census takers, for example, do not "create" the population figures that emerge from their efforts; in a sense, they copy these figures from the world around them. Denicola, Copyright in Collections of Facts: A Theory for the Protection of Nonfiction Literary Works, 81 Colum. L. Rev. 516, 525 (1981) (hereinafter Denicola). Census data therefore do not trigger copyright because these data are not "original" in the constitutional sense. Nimmer § 2.03[E]. The same is true of all facts—scientific, historical, biographical, and news of the day. "[T]hey may not be copyrighted and are part of the public domain available to every person." Miller, supra, at 1369.

      Census takers do not create; they merely copy the figures from the world around them. All facts -- scientific, historical, biographical, and news of the day -- may not be copyrighted and are part of the public domain.

    4. It is this bedrock principle of copyright that mandates the law's seemingly disparate treatment of facts and factual compilations. "No one may claim originality as to facts." Id., § 2.11[A], p. 2-157. This is because facts do not owe their origin to an act of authorship. The distinction is one between creation and discovery: The first person to find and report a particular fact has not created the fact; he or she has merely discovered its existence. To borrow from Burrow-Giles, one who discovers a fact is not its "maker" or "originator." 111 U.S., at 58. "The discoverer merely finds and records." Nimmer § 2.03[E].

      No one may claim originality as to facts because facts do not owe their origin to an act of authorship. The distinction is one between creation and discovery.

    5. The originality requirement articulated in The Trade-Mark Cases and Burrow-Giles remains the touchstone of copyright protection today. See Goldstein v. California, 412 U.S. 546, 561-562 (1973). It is the very "premise of copyright law." Miller v. Universal City Studios, Inc., 650 F.2d 1365, 1368 (CA5 1981). Leading scholars agree on this point. As one pair of commentators succinctly puts it: "The originality requirement is constitutionally mandated for all works."

      The originality requirement is the touchstone of copyright protection today.

    6. In Burrow-Giles, the Court distilled the same requirement from the Constitution's use of the word "authors." The Court defined "author," in a constitutional sense, to mean "he to whom anything owes its origin; originator; maker." 111 U.S., at 58 (internal quotation marks omitted). As in The Trade-Mark Cases, the Court emphasized the creative component of originality. It described copyright as being limited to "original intellectual conceptions of the author," 111 U.S., at 58, and stressed the importance of requiring an author who accuses another of infringement to prove "the existence of those facts of originality, of intellectual production, of thought, and conception." Id., at 59-60.

      In Burrow-Giles the court defined authors, in a constitutional sense, to mean "he to whom anything owes its origin, originator, maker" and emphasized the creative component of originality.

    7. In The Trade-Mark Cases, the Court addressed the constitutional scope of "writings." For a particular work to be classified "under the head of writings of authors," the Court determined, "originality is required." 100 U.S., at 94. The Court explained that originality requires independent creation plus a modicum of creativity: "[W]hile the word writings may be liberally construed, as it has been, to include original designs for engraving, prints, &c., it is only such as are original, and are founded in the creative powers of the mind. The writings which are to be protected are the fruits of intellectual labor, embodied in the form of books, prints, engravings, and the like." Ibid. (emphasis in original).

      In The Trade-Mark Cases the Court addressed the constitutional scope of "writings," determining that for a particular work to be classified "under the head of writings of authors," "originality is required" -- independent creation plus a modicum of creativity.

    8. Originality does not signify novelty; a work may be original even though it closely resembles other works so long as the similarity is fortuitous, not the result of copying. To illustrate, assume that two poets, each ignorant of the other, compose identical poems. Neither work is novel, yet both are original and, hence, copyrightable.

      See Sheldon v. Metro-Goldwyn Pictures Corp., 81 F.2d 49, 54 (CA2 1936)

    9. Originality is a constitutional requirement. The source of Congress' power to enact copyright laws is Article I, § 8, cl. 8, of the Constitution, which authorizes Congress to "secur[e] for limited Times to Authors . . . the exclusive Right to their respective Writings." In two decisions from the late 19th century—The Trade-Mark Cases, 100 U.S. 82 (1879); and Burrow-Giles Lithographic Co. v. Sarony, 111 U.S. 53 (1884)—this Court defined the crucial terms "authors" and "writings." In so doing, the Court made it unmistakably clear that these terms presuppose a degree of originality.

      This Court defined the crucial terms authors and writings.

    10. The key to resolving the tension lies in understanding why facts are not copyrightable. The sine qua non of copyright is originality. To qualify for copyright protection, a work must be original to the author. See Harper & Row, supra, at 547-549. Original, as the term is used in copyright, means only that the work was independently created by the author (as opposed to copied from other works), and that it possesses at least some minimal degree of creativity. 1 M. Nimmer & D. Nimmer, Copyright §§ 2.01[A], [B] (1990) (hereinafter Nimmer). To be sure, the requisite level of creativity is extremely low; even a slight amount will suffice. The vast majority of works make the grade quite easily, as they possess some creative spark, "no matter how crude, humble or obvious" it might be. Id., § 1.08[C][1].

      The sine qua non of copyright is originality.

    11. There is an undeniable tension between these two propositions. Many compilations consist of nothing but raw data—i.e., wholly factual information not accompanied by any original written expression. On what basis may one claim a copyright in such a work? Common sense tells us that 100 uncopyrightable facts do not magically change their status when gathered together in one place. Yet copyright law seems to contemplate that compilations that consist exclusively of facts are potentially within its scope
    12. it is beyond dispute that compilations of facts are within the subject matter of copyright. Compilations were expressly mentioned in the Copyright Act of 1909, and again in the Copyright Act of 1976
    13. This case concerns the interaction of two well-established propositions. The first is that facts are not copyrightable; the other, that compilations of facts generally are. Each of these propositions possesses an impeccable pedigree. That there can be no valid copyright in facts is universally understood. The most fundamental axiom of copyright law is that "[n]o author may copyright his ideas or the facts he narrates." Harper & Row, Publishers, Inc. v. Nation Enterprises, 471 U.S. 539, 556 (1985).

      The most fundamental axiom of copyright law is that "no author may copyright his ideas or the facts he narrates." Harper & Row, Publishers, Inc. v. Nation Enterprises, 471 U.S. 539, 556 (1985).

    14. This case requires us to clarify the extent of copyright protection available to telephone directory white pages
    1. Alexander v. Haley, 460 F.Supp. 40 (S.D.N.Y. 1978)
    2. Lecture 1: The Foundations of Copyright Law

      Readings:

      • 17 U.S.C. 102
      • Feist Publications, Inc. v. Rural Telephone Service Co., 499 U.S. 340 (1991)
      • Mannion v. Coors Brewing Co., 377 F.Supp. 2d 444 (S.D.N.Y. 2005)
      • Alexander v. Haley, 460 F.Supp. 40 (S.D.N.Y. 1978)
  2. Jan 2014
    1. In addition, the results imply that there is a lack of awareness about the importance of metadata among the scientific community –at least in practice– which is a serious problem as their involvement is quite crucial in dealing with problems regarding data management.

      Is there any reasonable agreement about what the term metadata means or includes? For example, how important is the unit of measure to scientists (feet vs meters) and is that information considered metadata or simply an implied part inherent in the data itself?

    2. Less than half (45%) of the respondents are satisfied with their ability to integrate data from disparate sources to address research questions

      The most important take-away I see in this whole section on reasons for not making data electronically available is not mentioned here directly!

      Here are the raw numbers for I am satisfied with my ability to integrate data from disparate sources to address research questions:

      • 156 (12.2%) Agree Strongly
      • 419 (32.7%) Agree Somewhat
      • 363 (28.3%) Neither Agree nor Disagree
      • 275 (21.5%) Disagree Somewhat
      • 069 (05.4%) Disagree Strongly
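
      A quick check, assuming these five counts are the complete set of responses: the total is 156 + 419 + 363 + 275 + 69 = 1,282, and the two "Agree" groups sum to 156 + 419 = 575, or about 44.9% -- matching the "less than half (45%)" satisfied figure quoted above. The two "Disagree" groups total 344, or about 26.8%.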

      Of the people who are not satisfied in some way, how many of those think current data sharing mechanisms are sufficient for their needs?

      Of the ~5% of people who are strongly dissatisfied, how many of those are willing to spend time, energy, and money on new sharing mechanisms, especially ones that are not yet proven? If they are willing to do so, then what measurable result or impact will the new mechanism have over the status quo?

      Who feels that current sharing mechanisms stand in the way of publications, tenure, promotion, or being cited?

      Of those who are dissatisfied, how many have existing investment in infrastructure versus those who are new and will be investing versus those who cannot invest in old or new?

      10 years ago how would you have convinced someone they need an iPad or Android smartphone?

    3. Reasons for not making data electronically available. Regarding their attitudes towards data sharing, most of the respondents (85%) are interested in using other researchers' datasets, if those datasets are easily accessible. Of course, since only half of the respondents report that they make some of their data available to others and only about a third of them (36%) report their data is easily accessible, there is a major gap evident between desire and current possibility. Seventy-eight percent of the respondents said they are willing to place at least some their data into a central data repository with no restrictions. Data repositories need to make accommodations for varying levels of security or access restrictions. When asked whether they were willing to place all of their data into a central data repository with no restrictions, 41% of the respondents were not willing to place all of their data. Nearly two thirds of the respondents (65%) reported that they would be more likely to make their data available if they could place conditions on access. Less than half (45%) of the respondents are satisfied with their ability to integrate data from disparate sources to address research questions, yet 81% of them are willing to share data across a broad group of researchers who use data in different ways. Along with the ability to place some restrictions on sharing for some of their data, the most important condition for sharing their data is to receive proper citation credit when others use their data. For 92% of the respondents, it is important that their data are cited when used by other researchers. Eighty-six percent of survey respondents also noted that it is appropriate to create new datasets from shared data. Most likely, this response relates directly to the overwhelming response for citing other researchers' data. The breakdown of this section is presented in Table 13.

      Categories of data sharing considered:

      • I would use other researchers' datasets if their datasets were easily accessible.
      • I would be willing to place at least some of my data into a central data repository with no restrictions.
      • I would be willing to place all of my data into a central data repository with no restrictions.
      • I would be more likely to make my data available if I could place conditions on access.
      • I am satisfied with my ability to integrate data from disparate sources to address research questions.
      • I would be willing to share data across a broad group of researchers who use data in different ways.
      • It is important that my data are cited when used by other researchers.
      • It is appropriate to create new datasets from shared data.
    4. Data sharing practices. Only about a third (36%) of the respondents agree that others can access their data easily, although three-quarters share their data with others (see Table 11). This shows there is a willingness to share data, but it is difficult to achieve or is done only on request.

      There is a willingness, but not a way!

    5. Nearly one third of the respondents chose not to answer whether they make their data available to others. Of those who did respond, 46% reported they do not make their data electronically available to others. Almost as many reported that at least some of their data are available somehow, either on their organization's website, their own website, a national network, a global network, a personal website, or other (see Table 10). The high percentage of non-respondents to this question most likely indicates that data sharing is even lower than the numbers indicate. Furthermore, the fact that less than 6% of scientists are making “All” of their data available via some mechanism tends to reinforce the lack of data sharing within the communities surveyed.
    6. Adding descriptive metadata to datasets helps make the dataset more accessible by others and into the future. Respondents were asked to indicate all metadata standards they currently use to describe their data. More than half of the respondents (56%) reported that they did not use any metadata standard and about 22% of respondents indicated they used their own lab metadata standard. This could be interpreted to mean that over 78% of survey respondents either use no metadata or a local, home-grown metadata approach.

      Not surprising that roughly 80% use no or ad hoc metadata.

    7. Data reuse. Respondents were asked to indicate whether they have the sole responsibility for approving access to their data. Of those who answered this question, 43% (n=545) have the sole responsibility for all their datasets, 37% (n=466) have for some of their datasets, and 21% (n=266) do not have the sole responsibility.
    8. Policies and procedures sometimes serve as an active rather than passive barrier to data sharing. Campbell et al. (2003) reported that government agencies often have strict policies about secrecy for some publicly funded research. In a survey of 79 technology transfer officers in American universities, 93% reported that their institution had a formal policy that required researchers to file an invention disclosure before seeking to commercialize research results. About one-half of the participants reported institutional policies that prohibited the dissemination of biomaterials without a material transfer agreement, which have become so complex and demanding that they inhibit sharing [15].

      Policies and procedures are barriers, but there are many more barriers beyond that which get in the way first.

    9. data practices of researchers – data accessibility, discovery, re-use, preservation and, particularly, data sharing
      • data accessibility
      • discovery
      • re-use
      • preservation
      • data sharing
    1. The Data Life Cycle: An Overview The data life cycle has eight components: Plan: description of the data that will be compiled, and how the data will be managed and made accessible throughout its lifetime Collect: observations are made either by hand or with sensors or other instruments and the data are placed into a digital form Assure: the quality of the data are assured through checks and inspections Describe: data are accurately and thoroughly described using the appropriate metadata standards Preserve: data are submitted to an appropriate long-term archive (i.e. data center) Discover: potentially useful data are located and obtained, along with the relevant information about the data (metadata) Integrate: data from disparate sources are combined to form one homogeneous set of data that can be readily analyzed Analyze: data are analyzed

      The lifecycle according to who? This 8-component description is from the point of view of only the people who obsessively think about this "problem".

      Ask a researcher and I think you'll hear that lifecycle means something like:

      collect -> analyze -> publish
      

      or a more complex data management plan might be:

      ask someone -> receive data in email -> analyze -> cite -> publish -> tenure
      

      To most people lifecycle means "while I am using the data" and archiving means "my storage guy makes backups occasionally".

      Asking people to be aware of the whole cycle outlined here is a non-starter, but I think there is another approach to achieve what we want... dramatic pause [to be continued]

      What parts of this cycle should the individual be responsible for vs which parts are places where help is needed from the institution?

    2. Data represent important products of the scientific enterprise that are, in many cases, of equivalent or greater value than the publications that are originally derived from the research process. For example, addressing many of the grand challenge scientific questions increasingly requires collaborative research and the reuse , integration, and synthesis of data.

      Who else might care about this other than Grand Challenge Question researchers?

    3. Journals and sponsors want you to share your data

      What is the sharing standard? What are the consequences of not sharing? What is the enforcement mechanism?

      There are three primary sharing mechanisms I can think of today: email, usb stick, and dropbox (née ftp).

      The dropbox option is supplanting ftp which comes from another era, but still satisfies an important niche for larger data sets and/or higher-volume or anonymous traffic.

      Dropbox, email and usb are all easily accessible parts of the day-to-day consumer workflow; they are all trivial to set up without institutional support or, importantly, permission.

      An email account is already provisioned by default for everyone or, if the institutional email offerings are not sufficient, a person may easily set up a 3rd-party email account with no permission or hassle.

      Data management alternatives to these three options will have slow or no adoption until the barriers to access and use are as low as email; the cost of entry needs to be no more than "a web browser, an email address, and no special permission required."

    4. An effective data management program would enable a user 20 years or longer in the future to discover , access , understand, and use particular data [ 3 ]. This primer summarizes the elements of a data management program that would satisfy this 20-year rule and are necessary to prevent data entropy .

      Who cares most about the 20-year rule? This is an ideal that appeals to some, but in practice even the most zealous adherents can't picture what this looks like in some concrete way-- except in the most traditional ways: physical paper journals in libraries are tangible examples of the 20-year rule.

      Until we have a digital equivalent for data I don't blame people looking for tenure or jobs for not caring about this ideal if we can't provide a clear picture of how to achieve this widely at an institutional level. For digital materials I think the picture people have in their minds is of tape backup. Maybe this is generational? Only once new generations are no longer widely exposed to cassette tapes, DVDs, and other physical media that "old people" remember will it be possible to have a new ideal that people can see in their mind's eye.

    5. A key component of data management is the comprehensive description of the data and contextual information that future researchers need to understand and use the data. This description is particularly important because the natural tendency is for the information content of a data set or database to undergo entropy over time (i.e. data entropy ), ultimately becoming meaningless to scientists and others [ 2 ].

      I agree with the key component mentioned here, but I feel the term data entropy is an unhelpful crutch.

    6. data entropy Normal degradation in information content associated with data and metadata over time (paraphrased from [ 2 ]).

      I'm not sure what this really means and I don't think data entropy is a helpful term. Poor practices certainly lead to disorganized collections of data, but I think this notion comes from a time when people were very concerned about degradation of physical media on which data is stored. That is, of course, still a concern, but I think the term data entropy really lends itself as an excuse for people who don't use good practices to manage data and is a cover for the real problem which is a kind of data illiteracy in much the same way we also face computational illiteracy widely in the sciences. Managing data really is hard, but let's not mask it with fanciful notions like data entropy.

    7. Although data management plans may differ in format and content, several basic elements are central to managing data effectively.

      What are the "several basic elements?"

    8. By documenting your data and recommending appropriate ways to cite your data, you can be sure to get credit for your data products and their use

      Citation is an incentive. An answer to the question "What's in it for me?"

    9. This primer describes a few fundamental data management practices that will enable you to develop a data management plan, as well as how to effectively create, organize, manage, describe, preserve and share data

      Data management practices:

      • create
      • organize
      • manage
      • describe
      • preserve
      • share
    10. The goal of data management is to produce self-describing data sets. If you give your data to a scientist or colleague who has not been involved with your project, will they be able to make sense of it? Will they be able to use it effectively and properly?
    1. If federally funded research is going to broadly benefit society, it has to be widely accessible, not just to curious private citizens, but also to industries, private organizations, and federal, state, and local governments where scientific knowledge can help create new products, solve problems, educate students, and make policy decisions.

      It is The People who will most benefit from open access to federally funded research.

    2. Giving the public what it paid for sounds noble, but from where I sit, a scientist at a well-funded research university, ensuring that research papers are available to the public for free seems pointless.

      This seems to be a common sentiment -- the open access arguments don't address the individual "what's in it for me?" question. And it is not wrong for people to be asking this question -- not just what benefits them, but also what misery are they in for if they start down this unknown (and possibly treacherous) path? It is the rare few intrepid leaders in this space who can see beyond the immediate benefits and risks -- who can see a new world of science that could exist and are willing to make the epically dangerous journey along with their loyal argonauts who can withstand the siren song and sail safely through the academic Scylla and Charybdis.

    1. One respondent noted that NSF doesn't have an enforcement policy. This is presumably true of other mandate sources as well, and brings up the related and perhaps more significant problem that mandates are not always (if they are ever) accompanied by the funding required to satisfy them. Another respondent wrote that funding agencies expect universities to contribute to long-term data storage.
    2. Data management activities, grouped. The data management activities mentioned by the survey can be grouped into five broader categories: "storage" (comprising backup or archival data storage, identifying appropriate data repositories, day-to-day data storage, and interacting with data repositories); "more information" (comprising obtaining more information about curation best practices and identifying appropriate data registries and search portals); "metadata" (comprising assigning permanent identifiers to data, creating and publishing descriptions of data, and capturing computational provenance); "funding" (identifying funding sources for curation support); and "planning" (creating data management plans at proposal time). When the survey results are thus categorized, the dominance of storage is clear, with over 80% of respondents requesting some type of storage-related help. (This number may also reflect a general equating of curation with storage on the part of respondents.) Slightly fewer than 50% of respondents requested help related to metadata, a result explored in more detail below.

      Categories of data management activities:

      • storage
        • backup/archival data storage
        • identifying appropriate data repositories
        • day-to-day data storage
        • interacting with data repositories
      • more information
        • obtaining more information about curation best practices
        • identifying appropriate data registries
        • search portals
      • metadata
        • assigning permanent identifiers to data
        • creating/publishing descriptions of data
        • capturing computational provenance
      • funding
        • identifying funding sources for curation support
      • planning
        • creating data management plans at proposal time
    3. Locally and/or externally focused departments. These departments look almost exclusively to external repositories or locally-provided solutions. To the extent these solutions suffice, the departments may need little help from campus.

      Where do faculty and researchers turn?

      Patrick:

      Sciences cluster around "me" category.

      Humanities clusters around "others" category.

      Highlight by Chris during today's discussion.

    4. Data management activities, grouped. The data management activities mentioned by the survey can be grouped into five broader categories: "storage" (comprising backup or archival data storage, identifying appropriate data repositories, day-to-day data storage, and interacting with data repositories); "more information" (comprising obtaining more information about curation best practices and identifying appropriate data registries and search portals); "metadata" (comprising assigning permanent identifiers to data, creating and publishing descriptions of data, and capturing computational provenance); "funding" (identifying funding sources for curation support); and "planning" (creating data management plans at proposal time). When the survey results are thus categorized, the dominance of storage is clear, with over 80% of respondents requesting some type of storage-related help. (This number may also reflect a general equating of curation with storage on the part of respondents.) Slightly fewer than 50% of respondents requested help related to metadata, a result explored in more detail below.

      Storage is a broad topic and is a very frequently mentioned topic in all of the University-run surveys.

      http://www.alexandria.ucsb.edu/~gjanee/dc@ucsb/survey/plots/q4.2.png

      Highlight by Chris during today's discussion.

    5. Distribution of departments with respect to responsibility spheres. Ignoring the "Myself" choice, consider clustering the parties potentially responsible for curation mentioned in the survey into three "responsibility spheres": "local" (comprising lab manager, lab research staff, and department); "campus" (comprising campus library and campus IT); and "external" (comprising external data repository, external research partner, funding agency, and the UC Curation Center). Departments can then be positioned on a tri-plot of these responsibility spheres, according to the average of their respondents' answers. For example, all responses from FeministStds (Feminist Studies) were in the campus sphere, and thus it is positioned directly at that vertex. If a vertex represents a 100% share of responsibility, then the dashed line opposite a vertex represents a reduction of that share to 20%. For example, only 20% of ECE's (Electrical and Computer Engineering's) responses were in the campus sphere, while the remaining 80% of responses were evenly split between the local and external spheres, and thus it is positioned at the 20% line opposite the campus sphere and midway between the local and external spheres. Such a plot reveals that departments exhibit different characteristics with respect to curatorial responsibility, and look to different types of curation solutions.

      This section contains an interesting diagram showing the distribution of departments with respect to responsibility spheres:

      http://www.alexandria.ucsb.edu/~gjanee/dc@ucsb/survey/plots/q2.5.png

    6. In the course of your research or teaching, do you produce digital data that merits curation? 225 of 292 (77%) of respondents answered "yes" to this first question, which corresponds to 25% of the estimated population of 900 faculty and researchers who received the survey.

      For those who do not feel they have data that merits curation, I would at least like to hear a description of the kinds of data they have and why they feel it does not need to be curated.

      Some people may already be using well-curated data sets; on the other hand, some feel their data would not be useful to anyone outside their own research group, so there is no need to curate the data for anyone else -- even though, under some definition of "curation," there may be important unmet curation needs for internal use only, visible only to the grad students or researchers who work with the data hands-on daily.

      UPDATE: My question is essentially answered here: https://hypothes.is/a/xBpqzIGTRaGCSmc_GaCsrw

    7. Responsibility, myself versus others. It may appear that responses to the question of responsibility are bifurcated between "Myself" and all other parties combined. However, respondents who identified themselves as being responsible were more likely than not to identify additional parties that share that responsibility. Thus, curatorial responsibility is seen as a collaborative effort. (The "Nobody" category is a slight misnomer here as it also includes non-responses to this question.)

      This answers my previous question about this survey item:

      https://hypothes.is/a/QrDAnmV8Tm-EkDuHuknS2A

    8. Awareness of data and commitment to its preservation are two key preconditions for successful data curation.

      Great observation!

    9. Which parties do you believe have primary responsibility for the curation of your data? Almost all respondents identified themselves as being personally responsible.

      For those who identify themselves as personally responsible, would they identify themselves (or their group) as the only ones responsible for the data? Or is there a belief that the institution should also be responsible in some way in addition to themselves?

    10. Availability of the raw survey data is subject to the approval of the UCSB Human Subjects Committee.
    11. Survey design The survey was intended to capture as broad and complete a view of data production activities and curation concerns on campus as possible, at the expense of gaining more in-depth knowledge.

      Summary of the survey design

    12. Researchers may be underestimating the need for help using archival storage systems and dealing with attendant metadata issues.

      In my mind this is a key challenge: even if people can describe what they need for themselves (that in itself is a very hard problem), what to do from the infrastructure standpoint to implement services that aid the individual researcher and also aid collaboration across individuals in the same domain, as well as across domains and institutions... in a long-term sustainable way is not obvious.

      In essence... how do we translate needs that we don't yet fully understand into infrastructure with low barrier to adoption, use, and collaboration?

    13. Researchers view curation as a collaborative activity and collective responsibility.
    14. To summarize the survey's findings: Curation of digital data is a concern for a significant proportion of UCSB faculty and researchers. Curation of digital data is a concern for almost every department and unit on campus. Researchers almost universally view themselves as personally responsible for the curation of their data. Researchers view curation as a collaborative activity and collective responsibility. Departments have different curation requirements, and therefore may require different amounts and types of campus support. Researchers desire help with all data management activities related to curation, predominantly storage. Researchers may be underestimating the need for help using archival storage systems and dealing with attendant metadata issues. There are many sources of curation mandates, and researchers are increasingly under mandate to curate their data. Researchers under curation mandate are more likely to collaborate with other parties in curating their data, including with their local labs and departments. Researchers under curation mandate request more help with all curation-related activities; put another way, curation mandates are an effective means of raising curation awareness. The survey reflects the concerns of a broad cross-section of campus.

      Summary of survey findings.

    15. In 2012 the Data Curation @ UCSB Project surveyed UCSB campus faculty and researchers on the subject of data curation, with the goals of 1) better understanding the scope of the digital curation problem and the curation services that are needed, and 2) characterizing the role that the UCSB Library might play in supporting curation of campus research outputs.

      1) better understanding the scope of the digital curation problem and the curation services that are needed

      2) characterizing the role that the UCSB Library might play in supporting curation of campus research outputs.

    1. In all cases, one standard is clear: Each of these vendors is betting very heavily on HTML5-based applications as well as methods to make HTML5 compatibility the basis for their future. Whether made from Java or other language frameworks, HTML5 is the common thread that runs through each of these alternative mobile operating systems. Start with HTML5, and your applications’ portability is almost assured.

      Is there any other reasonable bet than HTML5?! Especially in the smartdevice realm, where a rich set of HTML5-family features is already enabled, it makes less and less sense to build native applications except for special edge cases. And smartdevice competitors cannot possibly compete against iOS and Android with their own unique native app development formats -- so HTML5 would seem the only reasonable place to focus development of new apps. Where Ubuntu succeeds is in compatibility with an already well-established Linux ecosystem.

    1. the parties, the procedural posture, the facts, the issue, the holding, and the analysis.

      Parts of a judicial opinion identified in a student brief:

      • parties
      • procedural posture
      • facts
      • issues
      • holding
      • analysis
    2. When a law student briefs a case, he typically identifies several pieces of information: the parties, the procedural posture, the facts, the issue, the holding, and the analysis. Although it seems foreign at first, identifying this information, understanding judicial opinions, and applying their reasoning to new cases becomes much easier with practice.

      The legal brief described here is a student brief, not to be confused with an appellate brief; the distinction is described in more detail in How To Brief a Case.

    3. the judge will state the legal issue(s) involved, her decision about the issues (the holding), and her reasoning.

      The holding is the part of a judicial opinion that states the decision about the legal issues involved in the case.

    4. How to Read Opinions

      This section on how to read judicial opinions helpfully describes the components of what an opinion contains and some discussion of the challenges in identifying those components within the structure of the opinion.

      The components identified here are:

      • caption/name of parties
      • name of the court
      • date of the opinion
      • date of oral arguments in appellate cases
      • citation information
      • name of judge(s) who wrote the opinion
      • case history
      • procedural posture (stage at which opinion was issued)
      • information about facts of the case (especially for trial court opinions)
      • statement of legal issues involved
      • the holding (decision about the issues)
      • the judge's reasoning
    5. The opinion will also typically give the name of the judge or justice who wrote it. In some cases, judges sitting together will decide not to reveal wh o wrote an opinion. In that situation, it will say p e r c u r i a m /DWLQIRU³E\WKHFRXUW ́ ) i QSODFHRIDMXGJH¶VQDPH

      The garbled text quoted here should be:

      it will say per curiam (Latin for "by the court") in place of a judge's name.

    6. In a judicial opinion, the judge explains her ruling and the reasoning behind it. At its heart, an opinion is similar to a scholarly essay or even a short story. However, like any genre, the judicial opinion has some unique and unusual characteristics.

      The purpose of a judicial opinion is to explain the ruling and the reasoning behind it.

    1. Student brief A student brief is a short summary and analysis of the case prepared for use in classroom discussion. It is a set of notes, presented in a systematic way, in order to sort out the parties, identify the issues, ascertain what was decided, and analyze the reasoning behind decisions made by the courts. Although student briefs always include the same items of information, the form in which these items are set out can vary. Before committing yourself to a particular form for briefing cases, check with your instructor to ensure that the form you have chosen is acceptable.
    2. Appellate brief An appellate brief is a written legal argument presented to an appellate court. Its purpose is to persuade the higher court to uphold or reverse the trial court’s decision. Briefs of this kind are therefore geared to presenting the issues involved in the case from the perspective of one side only. Appellate briefs from both sides can be very valuable to anyone assessing the legal issues raised in a case. Unfortunately, they are rarely published. The U.S. Supreme Court is the only court for which briefs are regularly available in published form. The Landmark Briefs series (REF. LAW KF 101.9 .K8) includes the full texts of briefs relating to a very few of the many cases heard by this court. In addition, summaries of the briefs filed on behalf of the plaintiff or defendant for all cases reported are included in the U.S. Supreme Court Reports. Lawyer’s Ed., 2nd. series (REF. LAW KF 101 .A42).
    3. Confusion often arises over the term “legal brief.” There are at least two different senses in which the term is used.

      Two different senses of the term legal brief are described here: the appellate brief and the student brief.

    1. This suggests that peer production will thrive where projects have three characteristics

      If thriving is a metric of success (is it measurable? too subjective?), then the three characteristics a project must have are:

      • modularity: divisible into components
      • granularity: fine-grained modularity
      • integrability: low-cost integration of contributions

      I don't dispute that these characteristics are needed, but they are too general to be helpful, so I propose that we look at these three characteristics through the lens of the type of contributor we are seeking to motivate.

      How do these characteristics inform what we should focus on to remove barriers to collaboration for each of these contributor-types?

      Below I've made up a rough list of lenses. Maybe you have links or references that have already made these classifications better than I have... if so, share them!

      Roughly here are the classifications of the types of relationships to open source projects that I commonly see:

      • core developers: either hired by a company, foundation, or some entity to work on the project. These people care most about integrability.

      • ecosystem contributors: someone either self-motivated or who receives a reward via some mechanism outside the institution that funds the core developers (e.g. reputation, portfolio for future job prospects, tools and platforms that support a consulting business, etc). These people care most about modularity.

      • feature-driven contributors: The project is useful out-of-the-box for these people, and rather than build their own tool from scratch they see that it is possible for the tool to work the way they want by merely contributing code or at least a feature-request based on their idea. These people care most about granularity.

      The above lenses fit the characteristics outlined in the article, but below are other contributor-types that don't directly care about these characteristics.

      • the funder: a company, foundation, crowd, or some other funding body that directly funds the core developers to work on the project for hire.

      • consumer contributors: This class of people might not even be aware that they are contributors, but simply using the project returns direct benefits through logs and other instrumented uses of the tool to generate data that can be used to improve the project.

      • knowledge-driven contributors: These contributors are most likely closest to the ecosystem contributors, maybe even a sub-species of those, that contribute to documentation and learning the system; they may be less-skilled at coding, but still serve a valuable part of the community even if they are not committing to the core code base.

      • failure-driven contributors: A primary source of bug reports and may also be any one of the other lenses.

      What other lenses might be useful to look through? What characteristics are we missing? How can we reduce barriers to contribution for each of these contributor types?

      I feel that there are plenty of motivations... but what barriers exist and what motivations are sufficient for enough people to be willing to surmount those barriers? I think it may be easier to focus on the barriers to make contributing less painful for the already-convinced, than to think about the motivators for those needing to be convinced-- I think the consumer contributors are some of the very best suited to convince the unconvinced; our job should be to remove the barriers for people at each stage of community we are trying to build.

      A note to the awesome folks at Hypothes.is who are reading our consumer contributions... given the current state of the hypothes.is project, what class of contributors are you most in need of?

    2. the proposition that diverse motivations animate human beings, and, more importantly, that there exist ranges of human experience in which the presence of monetary rewards is inversely related to the presence of other, social-psychological rewards.

      The first analytic move.

    3. common appropriation regimes do not give a complete answer to the sustainability of motivation and organization for the truly open, large-scale nonproprietary peer production projects we see on the Internet.

      Towards the end of our last conversation the text following "common appropriation" seemed an interesting place to dive into further for our future discussions.

      I have tagged this annotation with "meta" because it is a comment about our discussion and where to continue it rather than an annotation focused on the content itself.

      In the future I would be interested in exploring the idea of "annotation types" that can be selectively turned on and off, but for now will handle that with ad hoc tags like "meta".

    4. The following selection from The Yale Law Journal is not paginated and should not be used for citation purposes.

      Note that this disclaimer only says the document should not be used for citation purposes, but doesn't say we can't use it for annotation purposes like testing out the Chrome PDF.js + Hypothes.is extension! :)

      You can install the extension from the Chrome Web Store with this link:

      https://chrome.google.com/webstore/detail/pdfjs-%2B-hypothesis/bipacimpfefoidapjkknffflfpfmjdog/related

    5. understanding that when a project of any size is broken up into little pieces, each of which can be performed by an individual in a short amount of time, the motivation to get any given individual to contribute need only be very small.

      The second analytic move.

    1. I call this new kind of bad roadway behavior vague driving. It's as if people are sort of surprised to find themselves behind the wheel of a car. They're slightly puzzled by the experience. They may even be a little intimidated by the car. I'm not sure.

      Such an awesome description... and, sadly, apt. I would much rather ride my motorcycle alongside a still-being-tested Google autonomous vehicle than these "vague drivers".

    1. Once you abandon entirely the crazy idea that the type of a value has anything whatsoever to do with the storage, it becomes much easier to reason about it. Of course, my point above stands: you don't need to reason about it unless you are writing unsafe code or doing some sort of heavy interoperating with unmanaged code. Let the compiler and the runtime manage the lifetime of your storage locations; that's what its good at.

      Understanding what you should (and should not) reason about in the language you are using is an important part of good programming; and a language that lets you reason (read: worry) about only the things you actually need to worry about is an important part of a good programming language.

    2. There are three kinds of storage locations: stack locations, heap locations, and registers.
    3. There are three kinds of values: (1) instances of value types, (2) instances of reference types, and (3) references. (Code in C# cannot manipulate instances of reference types directly; it always does so via a reference. In unsafe code, pointer types are treated like value types for the purposes of determining the storage requirements of their values.)
    4. Having made these points many times in the last few years, I've realized that the fundamental problem is in the mistaken belief that the type system has anything whatsoever to do with the storage allocation strategy. It is simply false that the choice of whether to use the stack or the heap has anything fundamentally to do with the type of the thing being stored. The truth is: the choice of allocation mechanism has to do only with the known required lifetime of the storage.

      The type system has nothing to do with the storage allocation strategy; the choice of allocation mechanism has to do only with the known required lifetime of the storage.
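
      To make the "copied by value" point concrete, here is a small sketch I put together in Go (not C#, so it is only an analogy): a Go struct behaves like a value type in that assignment copies the value, while a pointer behaves like a reference in that copying it copies the reference, not the thing it refers to.

          package main

          import "fmt"

          // Point plays the role of a value type: assignment copies the value.
          type Point struct{ X, Y int }

          func main() {
              a := Point{X: 1, Y: 2}
              b := a                // b gets its own copy of the value
              b.X = 99
              fmt.Println(a.X, b.X) // 1 99 -- mutating the copy leaves the original alone

              p := &a               // p is a reference to a's storage
              q := p                // copying a reference copies the reference, not the value
              q.X = 99
              fmt.Println(a.X)      // 99 -- both references see the same storage
          }

      Note that nothing in the example says, or needs to say, where any of these values actually live in memory; the observable difference is purely the copy semantics.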

    1. A lot of people seem to think that heap allocation is expensive and stack allocation is cheap. They are actually about the same, typically. It’s the deallocation costs – the marking and sweeping and compacting and moving memory from generation to generation – that are massive for heap memory compared to stack memory.
    2. Now compare this to the stack. The stack is like the heap in that it is a big block of memory with a “high water mark”. But what makes it a “stack” is that the memory on the bottom of the stack always lives longer than the memory on the top of the stack; the stack is strictly ordered. The objects that are going to die first are on the top, the objects that are going to die last are on the bottom. And with that guarantee, we know that the stack will never have holes, and therefore will not need compacting. We know that the stack memory will always be “freed” from the top, and therefore do not need a free list. We know that anything low-down on the stack is guaranteed alive, and so we do not need to mark or sweep.
    3. This sketch is complicated by the fact that there are actually three such arenas; the CLR collector is generational. Objects start off in the “short lived” heap. If they survive they eventually move to the “medium lived” heap, and if they survive there long enough, they move to the “long lived” heap. The GC runs very often on the short lived heap and very seldom on the long lived heap; the idea is that we do not want to have the expense of constantly re-checking a long-lived object to see if it is still alive. But we also want short-lived objects to be reclaimed swiftly.
    4. When a garbage collection is performed there are three phases: mark, sweep and compact. In the “mark” phase, we assume that everything in the heap is “dead”. The CLR knows what objects were “guaranteed alive” when the collection started, so those guys are marked as alive. Everything they refer to is marked as alive, and so on, until the transitive closure of live objects are all marked. In the “sweep” phase, all the dead objects are turned into holes. In the “compact” phase, the block is reorganized so that it is one contiguous block of live memory, free of holes.
    5. If we’re in that situation when new memory is allocated then the “high water mark” is bumped up, eating up some of the previously “free” portion of the block. The newly-reserved memory is then usable for the reference type instance that has just been allocated. That is extremely cheap; just a single pointer move, plus zeroing out the newly reserved memory if necessary.
    6. The idea is that there is a large block of memory reserved for instances of reference types. This block of memory can have “holes” – some of the memory is associated with “live” objects, and some of the memory is free for use by newly created objects. Ideally though we want to have all the allocated memory bunched together and a large section of “free” memory at the top.
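
      To make the mark/sweep description above a bit more concrete, here is a toy sketch I wrote in Go (the CLR's real collector is generational, compacting, and far more sophisticated): starting from the objects known to be alive, follow references until the transitive closure is marked; anything left unmarked is dead and becomes a "hole" for the sweep.

          package main

          import "fmt"

          // object is a toy heap object: an id plus the objects it references.
          type object struct {
              id     int
              refs   []*object
              marked bool
          }

          // mark walks the reference graph from the known-alive roots,
          // marking the transitive closure of reachable objects.
          func mark(roots []*object) {
              for _, r := range roots {
                  if r == nil || r.marked {
                      continue
                  }
                  r.marked = true
                  mark(r.refs)
              }
          }

          // sweep reports the ids of dead (unmarked) objects; a real collector
          // would turn these into holes and later compact them away.
          func sweep(heap []*object) (dead []int) {
              for _, o := range heap {
                  if !o.marked {
                      dead = append(dead, o.id)
                  }
              }
              return dead
          }

          func main() {
              a := &object{id: 1}
              b := &object{id: 2}
              c := &object{id: 3}
              a.refs = []*object{b}             // a -> b; c is unreachable
              heap := []*object{a, b, c}

              mark([]*object{a})                // a is a root ("guaranteed alive")
              fmt.Println("dead:", sweep(heap)) // dead: [3]
          }
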
    1. I regret that the documentation does not focus on what is most relevant; by focusing on a largely irrelevant implementation detail, we enlarge the importance of that implementation detail and obscure the importance of what makes a value type semantically useful. I dearly wish that all those articles explaining what “the stack” is would instead spend time explaining what exactly “copied by value” means and how misunderstanding or misusing “copy by value” can cause bugs.

      Documentation should focus on semantically useful descriptions; another accompanying document (or annotation) can provide relevant implementation details upon request, but that deeper level of detail should be left out by default to avoid enlarging the importance of less relevant things.

    2. I find this characterization of a value type based on its implementation details rather than its observable characteristics to be both confusing and unfortunate
    3. Surely the most relevant fact about value types is not the implementation detail of how they are allocated, but rather the by-design semantic meaning of “value type”, namely that they are always copied “by value”.
    4. I blogged a while back about how “references” are often described as “addresses” when describing the semantics of the C# memory model. Though that’s arguably correct, it’s also arguably an implementation detail rather than an important eternal truth. Another memory-model implementation detail I often see presented as a fact is “value types are allocated on the stack”. I often see it because of course, that’s what our documentation says.
    1. Additional broader impacts will emerge from analyses of the whooping crane dataset. Through collaborations with endangered species biologists in the US Geological Survey, these analyses will have direct relevance to specific management actions for the whooping crane, such as the timing, group size, and composition of crane reintroductions and potentially their training with ultra-light aircraft.

      Broader impact for management of endangered species

    2. The project will develop an analysis package in the open-source language R and complement it with a step-by-step hands-on manual to make tools available to a broad, international user community that includes academics, scientists working for governments and non-governmental organizations, and professionals directly engaged in conservation practice and land management. The software package will be made publicly available under http://www.clfs.umd.edu/biology/faganlab/movement/.

      Output of the project:

      • analysis package written in R
      • step-by-step hands-on manual
      • make tools available to a broad, international community
      • software made publicly available

      Question: What software license will be used? The Apache License is potentially a good choice here: it is a well-established open source license, supported by a wide range of communities, with few obligations or barriers to access and use, which supports the goal of reaching a broad, international audience.

      Question: Will the data be made available under a license, as well? Maybe a CC license of some sort?

    3. These species represent not only different types of movement (on land, in air, in water) but also different types of relocation data (from visual observations of individually marked animals to GPS relocations to relocations obtained from networked sensor arrays).

      Movement types:

      • land
      • air
      • water

      Types of relocation data:

      • visual observations
      • GPS
      • networked sensor arrays
    4. For example, by statistically analyzing the interrelationships of relocation data among individuals, it will be possible to distinguish and quantify population-level movement patterns such as migration, range residency, and nomadism.

      Quantifying movement patterns at the population-level:

      • migration
      • range residency
      • nomadism
    5. intra-individual concordance

      Are there examples of this kind of data product at scale?

    6. This project will develop new and innovative data management and analysis tools focusing on the interrelationship of multiple moving individuals. These include measures that calculate 1) realized mobility (quantifying the relationship of individual to population ranges), 2) population dispersion (quantifying the spatial relationship among individuals), 3) movement coordination (quantifying the coordination of movements among individuals), and 4) intra-individual concordance (quantifying the spatial relationship of relocations of individuals over time). These innovative ways of treating animal movement data will allow researchers to investigate a broad range of new research questions.

      1) Realized mobility: Relationship of individual to population ranges.
      2) Population dispersion: Spatial relationship among individuals.
      3) Movement coordination: Coordination of movements among individuals.
      4) Intra-individual concordance: Spatial relationship of relocations of individuals over time.

    7. but scientists' understanding of the emergent spatial dynamics at the population level has not kept pace, in large part due to an absence of appropriate tools for data handling and statistical analysis.

      Tools gap needs to be filled to improve understanding of emergent spatial dynamics.

    8. A grant is awarded to University of Maryland, College Park to develop informatics tools that allow scientists and conservation managers to use animal relocation and tracking data to study movement processes at the population level.
    1. NSF Advances in Biological Informatics: "Informatics tools for population-level animal movements." with T. Mueller, P. Leimgruber, A. Royle, and J. Calabrese. Thomas Mueller, an Assistant Research Scientist in my lab, leads this project. Also on this grant, postdoc Chris Fleming is investigating theoretical aspects of animal foraging and statistical issues associated with empirical data on animal movements. This project is developing innovative data management and analysis tools that will allow scientists and conservation managers to use animal relocation and tracking data to study movement processes at the population-level, focusing on the interrelationship of multiple moving individuals. We are developing and testing these new tools using datasets on Mongolian gazelles, whooping cranes, and blacktip sharks. More information is available on the Movement Dynamics Homepage.
    1. My project seeks to develop computer models that simulate and link behavioral movement mechanisms which can be either based on memory, perceptual cues or triggered by environmental factors. It explores their efficiency under different scenarios of resource distributions across time and space. Finally it tries to integrate empirical data on resource distributions as well as movements of moving animals, such as satellite data on primary productivity and satellite tracking data of Mongolian gazelles.
    2. News Thomas Mueller and Bill Fagan receive a new NSF Bioinformatics grant Collaborators Peter Leimgruber Smithsonian Institution Volker Grimm Centre for Environmental Research - UFZ, Leipzig Kirk A. Olson University of Massachusetts Todd K. Fuller University of Massachusetts George B. Schaller Wildlife Conservation Society Nuria Selva Institute of Nature Conservation, Krakow

      Collaborators

      • Peter Leimgruber, Smithsonian Institution
      • Volker Grimm, Centre for Environmental Research - UFZ, Leipzig
      • Kirk A. Olson, University of Massachusetts
      • Todd K. Fuller, University of Massachusetts
      • George B. Schaller, Wildlife Conservation Society
      • Nuria Selva, Institute of Nature Conservation, Krakow
    1. In the Middle Ages, just the opposite was true. Reading was generally done aloud, often to an audience. It was an active process, so active that Susan Noakes, in her analysis of medieval reading, points out “that it had been recommended by physicians, since classical times, as a mild form of exercise, like walking.”

      Reading in the Middle Ages considered a mild form of exercise.

    2. that’s the strange thing about writing, which makes it truly analogous to painting. The painter’s products stand before us as if they were alive, but if you question them, they maintain a most majestic silence. It is the same with written words; they seem to talk to you as if they were intelligent, but if you ask them anything about what they say, from a desire to be instructed, they go on telling you just the same thing forever.

      Writing analogous to painting

    3. Socrates was concerned with reflective thought: the ability to think deeply about things, to question and examine every statement. He thought that reading was experiential, that it would not lead to reflection.
    4. Questioning and examination are the tools of reflection: Hear an idea, ponder it, question it, modify it, explore its limitations. When the idea is presented by a person, the audience can interrupt, ask questions, probe to get at the underlying assumptions. But the author doesn’t come along with a book, so how could the book be questioned if it couldn’t answer back? This is what bothered Socrates.

      This is what bothered Socrates.

    5. Socrates, Plato tells us, argued that books would destroy thought.

      Books as destroyers of thought

    1. Creating an atlas is more encompassing than image acquisition and analysis. It requires a clear understanding of the biological questions to be addressed. Then appropriate labeling, sample preparation, imaging, image analysis, visualization, and data management methods must be selected (Figure 2). An interdisciplinary team is required that collectively possess the needed expertise. Generating useful atlases is still in its infancy. Which methods to use at each step along the pipeline will depend greatly on what analysis is required. There is currently no ‘magic toolbox’ that scientists can use to apply to their specific task. Each step has to be tailored to suit the experiment.

      Atlases are more than just image acquisition and analysis.

      An interdisciplinary team is required that collectively possesses the needed expertise.

      There is no "magic toolbox"

    2. The database should have an associated web site for access by internal researchers working on atlas construction and quite likely a separate Web site for public access to published datasets.

      Need for web site access by internal researchers vs web-based public access to published datasets

    3. The creation and exploitation of large-scale quantitative atlases will lead to a more precise understanding of development.

      large-scale quantitative atlases lead to more precise understanding

    4. The results of the subsequent mathematical analysis can often be exported as additional rows and columns into an updated version of the atlas and then explored by the biologist using the visualization tool.

      Blending mathematical and visual analysis

    5. This exploration may itself lead to novel discoveries, but will also help the biologist better understand the quality and nature of the dataset, improving his or her ability to suggest analyses to computational colleagues.

      Exploration may lead to novel discoveries

    6. The challenge is that while biologists best understand the questions that can be addressed using the atlas, they may not always possess the computational and mathematical skills needed to conduct sophisticated analyses of such data files. For this reason, biologists generally collaborate with computational scientists. It is not always clear, though, what is the best way to frame the analysis.

      1) The challenge 2) Literacy 3) Framing the analysis

    7. Once a searchable atlas has been constructed there are fundamentally two approaches that can be used to analyze the data: one visual, the other mathematical.
    8. The initial inputs for deriving quantitative information of gene expression and embryonic morphology are raw image data, either of fluorescent proteins expressed in live embryos or of stained fluorescent markers in fixed material. These raw images are then analyzed by computational algorithms that extract features, such as cell location, cell shape, and gene product concentration. Ideally, the extracted features are then recorded in a searchable database, an atlas, that researchers from many groups can access. Building a database with quantitative graphical and visualization tools has the advantage of allowing developmental biologists who lack specialized skills in imaging and image analysis to use their knowledge to interrogate and explore the information it contains.

      1) Initial input is raw image data
      2) Feature extraction on raw image data
      3) Extracted features stored in shared, searchable database
      4) Database available to researchers from many groups
      5) Quantitative graphical and visualization tools allow access to those without specialized skill in imaging and image analysis

    9. approaches to establish permanent, quantitative datasets—atlases
    10. A rigorous understanding of these developmental processes requires automated methods that quantitatively record and analyze complex morphologies and their associated patterns of gene expression at cellular resolution.

      Rigorous understanding requires automated methods using quantitative recording and analysis.

    11. Just as comprehensive datasets of genomic sequence have revolutionalized biological discovery, large-scale quantitative measurements of gene expression and morphology will certainly be of great assistance in enabling computational embryology in the future. Such datasets will form the essential basis for systems level, computational models of molecular pathways and how gene expression concentrations and interactions alter to drive changes in cell shape, movement, connection, and differentiation. In this review, we discuss the strategies and methods used to generate such datasets.
    12. These large-scale quantitative data provide new insights that could not have been gained through qualitative analyses.

      Quantitative Data vs Qualitative Analysis

    13. Qualitative statements describe in a yes/no manner for example, which tissues a gene is expressed in or if two groups of cells move relative to one another. This basic information is insufficient, though, to address many fundamental questions in developmental biology.
    1. Difference between XZ and LZMA2 Short answer: xz is a format that (currently) only uses the lzma2 compression algorithm. Long answer: think of xz as a container for the compression data generated by the lzma2 algorithm. We also have this paradigm for video files for example: avi/mkv/mov/mp4/ogv are containers, and xvid/x264/theora are compression algorithms. The confusion is often made because currently, the xz format only supports the lzma2 algorithm (and it’ll remain the default, even if some day, others algorithms may be added). This confusion doesn’t happen with other formats/algorithms, as for example gzip is both a compression algorithm and a format. To be exact, the gzip format only supports to encapsulate data generated by gzip… the compression algorithm. In this article I’ll use “xz” to say “the lzma2 algorithm whose data is being encapsulated by the xz format”. You’ll probably agree it’s way simpler

      The key here is the notion of a format as a container. Lots of content is moving towards that notion-- that a "file" is really an opaque (to the OS filesystem) directory or container of some sort, and some other program understands the format of that "file" as a container and knows how to open it to access the files inside.
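
      As a small illustration of the container idea, here is a sketch I wrote in Go: you can identify the container format from the first few bytes of a file without knowing anything about the compression algorithm wrapped inside it. (The magic numbers below are the published ones for the xz and gzip formats; the file name is just a hypothetical example.)

          package main

          import (
              "bytes"
              "fmt"
              "io"
              "os"
          )

          // Magic bytes identify the container format, independent of the
          // compression algorithm carried inside it.
          var (
              xzMagic   = []byte{0xFD, '7', 'z', 'X', 'Z', 0x00} // xz container (currently lzma2 inside)
              gzipMagic = []byte{0x1F, 0x8B}                     // gzip: format and algorithm in one
          )

          func containerOf(path string) (string, error) {
              f, err := os.Open(path)
              if err != nil {
                  return "", err
              }
              defer f.Close()

              header := make([]byte, 6)
              if _, err := io.ReadFull(f, header); err != nil {
                  return "", err
              }
              switch {
              case bytes.HasPrefix(header, xzMagic):
                  return "xz container", nil
              case bytes.HasPrefix(header, gzipMagic):
                  return "gzip", nil
              }
              return "unknown", nil
          }

          func main() {
              kind, err := containerOf("example.xz") // hypothetical file name
              if err != nil {
                  fmt.Println(err)
                  return
              }
              fmt.Println(kind)
          }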

    1. at its core, DevOps is about culture
    2. The true essence of DevOps is empathy.
    3. Information exchange requires (and can contribute to) mutual understanding; e.g., empathy.

      What is the essence of empathy? Is it mutual understanding? What else is there?

    4. I was surprised to encounter empathy again in the context of cybernetics. This rediscovery happened thanks to a Twitter exchange with @seungchan​. Cybernetics tells us that, in order for any one or any thing to function, it must have a relationship with other people and/or things.
    5. I first encountered empathy as an explicit design principle in the context of design thinking. You can’t design anything truly useful unless you understand the people for whom you’re designing.

      Empathy as a design principle

    1. Is it because ops care deeply about systems while devs consider them a tool or implementation detail?

      What is the divide?

    2. The closest thing to common ground may be events for configuration-management software like PuppetConf or ChefConf, or possibly re:Invent.
    3. When I look at the DevOps “community” today, what I generally see is a near-total lack of overlap between people who started on the dev side and on the ops side.

      I see this same near-total lack of overlap. There is a different language, mindset, and approach.

    4. a China Miéville novel called The City & the City. It’s about two cities that literally overlap in geography, with the residents of each completely ignoring the other — and any violations, or breaches, of that separation are quickly enforced by a shadowy organization known as the Breach.
    1. Until academics get their acts together and start using new modes of publication, we need to recognize that actions like Aaron Swartz's civil disobedience are legitimate.

      Justification for civil disobedience

    2. Aaron Swartz's act of hacktivism was an act of resistance to a corrupt system that has subverted distribution of the most important product of the academy—knowledge.
    3. the philosophy department at the University of Michigan at Ann Arbor started an online journal called Philosophers' Imprint, noting in its mission statement the possibility of a sunnier alternative: "There is a possible future in which academic libraries no longer spend millions of dollars purchasing, binding, housing, and repairing printed journals, because they have assumed the role of publishers, cooperatively disseminating the results of academic research for free, via the Internet. Each library could bear the cost of publishing some of the world's scholarly output, since it would be spared the cost of buying its own copy of any scholarship published in this way. The results of academic research would then be available without cost to all users of the Internet, including students and teachers in developing countries, as well as members of the general public."

      Libraries as publishers. Not a bad idea.

    4. And JSTOR really was in an impossible bargaining position. Important scientific papers do not have cheaper alternatives. If someone wants to read Watson and Crick's paper on DNA or Einstein's paper on the photoelectric effect, it is not as if there is a paper by John Doe that is just as good and available for less. Academic publishers are, in effect, natural monopolies that can demand as much money as we can afford, and possibly more.
    5. But like the original authors, JSTOR had to negotiate its licensing agreements from a position of weakness. There is a wonderful history of JSTOR written by Roger C. Schonfeld. In it he notes that the charter publishers signed up by JSTOR (in particular the University of Chicago Press) demanded that they be compensated if there was a loss to their (minimal) sales of rights to older materials, and they demanded compensation even before JSTOR covered its own expenses.
    6. JSTOR, which did not pursue criminal charges against Swartz and "regretted being drawn into" the U.S. attorney's case against him, came into existence in 1995 with good intentions. It sought a solution to the rapidly expanding problem of paying for and storing an ever-growing list of academic journals. The situation for libraries was becoming untenable.
    7. Academic publishers have inverted their whole purpose for being; they used to be vehicles for the dissemination of knowledge in the most efficient way possible. Today they are useless choke points in the distribution of knowledge, even taking advantage of their positions to demand fees.
    8. There was a time when securing a contract with an academic publisher meant that the work would receive the widest audience possible.
    9. a "contract of adhesion"—meaning a contract in which one party has all the power and it was not freely bargained.
    10. If you don't publish, you won't get tenure. Even if you have tenure, your reputation (and salary) is staked to your publication record.
    11. The consensus so far has been that Swartz did something wrong by accessing and releasing millions of academic papers from the JSTOR archive.

      Background on the Aaron Swartz act of civil disobedience:

      http://about.jstor.org/news/jstor-statement-misuse-incident-and-criminal-case

    12. He laid the philosophical groundwork back in 2008, in an essay entitled "Guerilla Open Access Manifesto."
    13. The academic publisher Elsevier has contributed to many U.S. Congressional representatives, pushing the Elsevier-supported Research Works Act, which among other things would have forbidden any effort by any federal agency to ensure taxpayer access to work financed by the federal government without permission of the publisher.

      What other legislation has Elsevier pushed?

    14. "There is no justice in following unjust laws. It's time to come into the light and, in the grand tradition of civil disobedience, declare our opposition to this private theft of public culture."

      Civil disobedience is indeed a grand tradition... and what are the conditions under which to declare it is time to act again?

    1. Server error, there was an error while handling your request. Administrators have been notified, please try reloading the page.

      Annotating errors

    1. We regularly provide scholars with access to content for this purpose. Our Data for Research site (http://dfr.jstor.org)

      The access to this is exceedingly slow. Note that it is still in beta.

    2. The criminal investigation and today’s indictment of Mr. Swartz has been directed by the United States Attorney’s Office. It was the government’s decision whether to prosecute, not JSTOR’s. As noted previously, our interest was in securing the content. Once this was achieved, we had no interest in this becoming an ongoing legal matter.

      How was this initiated?

    1. The Harvard Business Review has been writing about the benefits of cultures of gratitude in the workplace.

      Great example to start with. Following the link http://blogs.hbr.org/2013/04/foster-a-culture-of-gratitude/ to read this (short) article was worthwhile, as well as following the link in that article to another one about How to Give a Meaningful Thanks: http://blogs.hbr.org/2013/02/how-to-give-a-meaningful-thank/

    1. more than half of all employees intended to search for new jobs because they felt underappreciated and undervalued.

      I have felt this in my own personal experience... but it's not that I didn't feel valued by my bosses; I actually very much felt valued by most of them. What was missing was the culture of gratitude: it wasn't enough to know that I felt valued. I want to know that I am part of a team where the other members also feel valued, both by the bosses above and by each other.

    2. Jon R. Katzenbach and Douglas K. Smith, authors of the Wisdom of Teams, define a high-performing team in part by members’ strong personal commitment to the growth and success of each team member and of the team as a whole.

      Strong personal commitment stands out as a key idea for me here... and specifically that it is not just focused on self, but on the growth and success of everyone in the team as a whole.

    3. High performing teams have well-defined goals, systems of accountability, clear roles and responsibilities, and open communication.

      I feel these are good defining characteristics of high-performing teams.

    4. Support camaraderie and collegiality

      Climate change-- fostering a workplace environment that celebrates compassionate communication.

    5. Involve employees

      I've learned again and again that I can't just do it all myself... but importantly, when I involve other people it makes my ideas even better (and often rescues me when my ideas didn't start out so great!)

    6. Help others develop

      Step 1 for cultivating a culture of gratitude.

    7. Several recent articles point out the importance of saying “thank you” and giving specific praise to employees when earned in genuine, honest, and heartfelt ways. Mark Gaston’s blog on How to Give a Meaningful Thank-you is full of great advice such as sharing with employees how their contributions had personal significance for the leader and team.

      I am glad I took the time to read "How to Give a Meaningful Thank-you"; the article resonates deeply with me. I feel good that I actively engage in those meaningful thank-yous with people, but I also see where and how I can do that more, too.

    1. Tell them what it personally meant to you.

      What was the impact to me? Share that with someone so they know that I felt it was important and specifically how or why to me personally.

    2. Acknowledge to them the effort (or personal sacrifice) that they made in doing the above.

      Recognize and voice that you see what they've given up.

    3. Thank them for something they specifically did that was above the call of duty.

      It's important to know what it takes to "exceed expectations". Does working hard and then working even harder for the same outcome go above the call of duty? Or does the outcome matter? Whatever the answer is, being specific in the thanks is important to communicate what you think the answer is.

    4. So take action now. Give that person what I call a Power Thank You. This has three parts

      I like articles and blog posts like this that have a call to action with a specific example of the action.

    5. research by Adam Grant and Francesca Gino has shown that saying thank you not only results in reciprocal generosity — where the thanked person is more likely to help the thanker — but stimulates prosocial behavior in general. In other words, saying “thanks” increases the likelihood your employee will not only help you, but help someone else.

      Reciprocal generosity... keystone habits

    6. They are a person deserving of your not infrequent acknowledgment and worthy of appreciation and respect. When was the last time you thanked them — really thanked them?

      Basic dignity and respect-- a good thing, indeed. We need more of that.

    7. research by Adam Grant and Francesca Gino has shown that saying thank you not only results in reciprocal generosity — where the thanked person is more likely to help the thanker — but stimulates prosocial behavior in general. In other words, saying “thanks” increases the likelihood your employee will not only help you, but help someone else.

      Good things generate more good things

    8. too often, they begin to view and treat their teams, and especially their assistants, as appliances

      Unfortunate and dehumanizing. :(

    9. So when I wrote to her boss, I included this: “When I get to be rich, I’m going to hire someone like your assistant — to protect me from people like me. She was helpful, friendly, feisty vs. boring and yet guarded access to you like a loyal pit bull. If she doesn’t know how valuable she is to you, you are making a big managerial mistake and YOU should know better.”

      Evocative

    1. How we meet this challenge depends on how we address the following fundamental question about teaching our digital-age children: Should we teach our children as though they have two lives, or one?

      two lives or one? Also, what about two names? A public name and a private name, as some cultures already have where only your friends call you by a certain name that others might not know.

    2. The tie that binds us to our ancestors is that both ancient and digital-age humans crave community—and all the things that make community possible: survival, effective communication, cultural stability, purposeful education for our children, and creative expression.
    1. But we were surprised that an unadorned set of 127 slides—no music, no animation—would become so influential.

      Maybe that's why it became influential? It didn't need all of that junk to keep people interested... the content alone did the job.

    1. Here’s a simple test: If your company has a performance bonus plan, go up to a random employee and ask, “Do you know specifically what you should be doing right now to increase your bonus?” If he or she can’t answer, the HR team isn’t making things as clear as they need to be.
    2. Instead of cheerleading, people in my profession should think of themselves as businesspeople. What’s good for the company? How do we communicate that to employees? How can we help every worker understand what we mean by high performance?
    1. I frequently see CEOs who are clearly winging it. They lack a real agenda. They’re working from slides that were obviously put together an hour before or were recycled from the previous round of VC meetings. Workers notice these things, and if they see a leader who’s not fully prepared and who relies on charm, IQ, and improvisation, it affects how they perform, too. It’s a waste of time to articulate ideas about values and culture if you don’t model and reward behavior that aligns with those goals.
    2. We continually told managers that building a great team was their most important task. We didn’t measure them on whether they were excellent coaches or mentors or got their paperwork done on time. Great teams accomplish great work, and recruiting the right team was the top priority.
    3. We distributed options every month, at a slight discount from the market price. We had no vesting period—the options could be cashed in immediately. Most tech companies have a four-year vesting schedule and try to use options as “golden handcuffs” to aid retention, but we never thought that made sense. If you see a better opportunity elsewhere, you should be allowed to take what you’ve earned and leave. If you no longer want to work with us, we don’t want to hold you hostage.
    4. We also believed in market-based pay and would tell employees that it was smart to interview with competitors when they had the chance, in order to get a good sense of the market rate for their talent. Many HR people dislike it when employees talk to recruiters, but I always told employees to take the call, ask how much, and send me the number—it’s valuable information.
    1. Discussing the military’s performance during the Iraq War, Donald Rumsfeld, the former defense secretary, once famously said, “You go to war with the army you have, not the army you might want or wish to have at a later time.” When I talk to managers about creating great teams, I tell them to approach the process in exactly the opposite way.

      Yes, approach it in the opposite way!

    2. I replied, “Why bother? We know how this will play out. You’ll write up objectives and deliverables for her to achieve, which she can’t, because she lacks the skills. Every Wednesday you’ll take time away from your real work to discuss (and document) her shortcomings. You won’t sleep on Tuesday nights, because you’ll know it will be an awful meeting, and the same will be true for her. After a few weeks there will be tears. This will go on for three months. The entire team will know. And at the end you’ll fire her. None of this will make any sense to her, because for five years she’s been consistently rewarded for being great at her job—a job that basically doesn’t exist anymore. Tell me again how Netflix benefits?

      Trying to remedy a situation where someone has "been consistently rewarded for being great at their job" and then working on a PIP with them really is a miserable process.

    3. HR people can’t believe that a company the size of Netflix doesn’t hold annual reviews. “Are you making this up just to upset us?” they ask. I’m not. If you talk simply and honestly about performance on a regular basis, you can get good results—probably better ones than a company that grades everyone on a five-point scale.
    1. Traditional corporate performance reviews are driven largely by fear of litigation. The theory is that if you want to get rid of someone, you need a paper trail documenting a history of poor achievement. At many companies, low performers are placed on “Performance Improvement Plans.” I detest PIPs. I think they’re fundamentally dishonest: They never accomplish what their name implies.
    2. Eliminating a formal policy and forgoing expense account police shifted responsibility to frontline managers, where it belongs.
    1. Instead, we tried really hard to not hire those people, and we let them go if it turned out we’d made a hiring mistake.
    2. One day I was talking with one of our best engineers, an employee I’ll call John. Before the layoffs, he’d managed three engineers, but now he was a one-man department working very long hours. I told John I hoped to hire some help for him soon. His response surprised me. “There’s no rush—I’m happier now,” he said. It turned out that the engineers we’d laid off weren’t spectacular—they were merely adequate. John realized that he’d spent too much time riding herd on them and fixing their mistakes. “I’ve learned that I’d rather work by myself than with subpar performers,” he said. His words echo in my mind whenever I describe the most basic element of Netflix’s talent philosophy: The best thing you can do for employees—a perk better than foosball or free sushi—is hire only “A” players to work alongside them. Excellent colleagues trump everything else.
    3. Despite her work ethic, her track record, and the fact that we all really liked her, her skills were no longer adequate. Some of us talked about jury-rigging a new role for her, but we decided that wouldn’t be right. So I sat down with Laura and explained the situation—and said that in light of her spectacular service, we would give her a spectacular severance package. I’d braced myself for tears or histrionics, but Laura reacted well: She was sad to be leaving but recognized that the generous severance would let her regroup, retrain, and find a new career path. This incident helped us create the other vital element of our talent management philosophy: If we wanted only “A” players on our team, we had to be willing to let go of people whose skills no longer fit, no matter how valuable their contributions had once been. Out of fairness to such people—and, frankly, to help us overcome our discomfort with discharging them—we learned to offer rich severance packages.
    1. Rule of thumb: When pulling changes from origin/develop onto your local develop use rebase. When finishing a feature branch merge the changes back to develop.
    1. If Master has diverged since the feature branch was created, then merging the feature branch into master will create a merge commit. This is a typical merge.
    1. People may use merge commits to represent the last deployed version of production code. That’s an antipattern. Use tags.
    2. Treat yourself as a writer and approach each commit as a chapter in a book. Writers don’t publish first drafts. Michael Crichton said, “Great books aren’t written– they’re rewritten.”
    3. Git is revolutionary because it gives you the best of both worlds. You can regularly check in changes while prototyping a solution but deliver a clean history when you’re finished. When this is your goal, Git’s defaults make a lot more sense.

      Git gets this basic division of worlds right and is a fundamental departure from other version control systems like SVN. The feature that enables all this is nearly cost-free, instantaneous branching.

      What makes this new world complex is not git; it is that the world itself is, quite simply, complex! Good tools like git help us manage (some of) that complexity.

    4. If you’re fighting Git’s defaults, ask why. Treat public history as immutable, atomic, and easy to follow. Treat private history as disposable and malleable. The intended workflow is: Create a private branch off a public branch. Regularly commit your work to this private branch. Once your code is perfect, clean up its history. Merge the cleaned-up branch back into the public branch.

      Good defaults are sometimes hard to recognize, especially when the tool is complex.

      Questioning the defaults-- and deciding why you would keep them or change them-- is a good antidote to dismissing something due to not understanding it.

      If you don't like the defaults but can't articulate why, then decide what you would choose instead and why you would change the default as it stands. Does the default make it easy to do the "right" thing AND hard to do the "wrong" thing? The second part of that question is the most important, since it might not be obvious what the "right" thing is.

      Even if you don't like the defaults, ask yourself whether they continually lead you away from perils and problems that would plague you if a different set of defaults were chosen.

  3. Nov 2013
    1. In its space-time representation (Ogata, 1998), the ETAS model is a temporal marked point process model, and a special case of marked Hawkes process, with conditional intensity function $\lambda(t, x, y \mid H_t) = \mu(x, y) + \sum_{t_i < t} k(m_i)\, g(t - t_i)\, f(x - x_i, y - y_i \mid m_i)$

      Testing out PDF annotation that also includes LaTeX-rendered formulas.

  4. Oct 2013
    1. This doesn’t mean gossip is always good.

      I'm glad to see a statement that the results from this experiment in "prosocial gossip" do not translate into meaning that gossip is always good.

      I wonder what insights we can take away from this and turn into action in our own environment?

      I feel the antidote we need is more meaningful and real-time feedback, not gossip.

    2. “Witnessing the unfair play,” the researchers write, “led to elevated heart rates for participants who had no opportunity to gossip.”

      I think I have seen this stress response in people when I have listened to them talk about witnessing unfair play.

  5. Sep 2013
    1. Welcome one-on-ones Career planning

      These conversations are important to me. Let's keep having them and having more of them.

    2. Blameless post-mortems

      Tim, thanks for organizing the very first blameless post-mortem-- an important step in our emergency and incident response that will lead to a better system and organization overall.

    3. Create slack time for important improvement projects

      This is one of the intended effects of The Gardener role. By centralizing interrupt handling in one person's job, it frees up time for the rest of us to focus on projects most of the time; only occasionally, every couple of months, will each of us have to worry about interrupts, when the role of The Gardener passes to us for the week.

    4. They also started to standardize and very deliberately reduce the supported infrastructure and configurations. One decision was to switch everything to PHP and MySQL. This was a philosophical decision, not a technology one: they wanted both Dev and Ops to be able to understand the stack, so that everyone can contribute if they wanted to, as well as enabling everyone to be able to read, rewrite and fix someone else’s code.

      NOTE: "This was a philosophical decision, not a technology one."

    5. and most importantly, a culture that the rest of the world admires.

      Starting with our group here in IDSG, I would like us to lead the way for EECS, CoE, Berkeley, and the UC in fostering a culture that the people around us admire.

    6. Visible Ops Handbook
    7. They have events like “Meetsy” (suggested lunch groups to meet people you may not work directly with) and “Eatsy” (where the entire company eats together).

      Don't eat alone.

      If all we're going to do is talk, let's eat or have a drink in our hands!