285 Matching Annotations
  1. May 2021
    1. The Guardian reports on a new study of glacier loss. Glaciers (excluding those on Greenland and in Antarctica) contribute roughly 20% of the global rise in sea level, currently about 0.74 mm per year. The rate at which they are thinning has doubled within 20 years. Losses are especially high in the Alps. On average, the glaciers have lost 267 gigatonnes per year.

      'We Need to Act Now': Glaciers Melting at Unprecedented Pace, Study Reveals - EcoWatch

    1. According to an international study published in the journal »Nature«, glaciers lost an average of 267 billion tonnes (gigatonnes) of ice per year between 2000 and 2019, with the largest losses occurring in the past five years. The melting ice now accounts for more than 20 percent of the rise in sea level.
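
      As a back-of-the-envelope check (my own arithmetic, not from either article): 1 gigatonne of meltwater is roughly 1 km³, and spreading 267 km³ per year over an ocean surface of about 3.6 × 10⁸ km² gives almost exactly the 0.74 mm of sea-level rise per year quoted above.

      ```python
      ocean_area_km2 = 3.61e8        # approximate global ocean surface area
      melt_km3_per_year = 267.0      # 267 Gt of water is roughly 267 km^3
      rise_mm_per_year = melt_km3_per_year / ocean_area_km2 * 1e6   # km -> mm
      print(rise_mm_per_year)        # ~0.74 mm per year, matching the figure above
      ```
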
  2. Apr 2021
  3. Mar 2021
  4. Jan 2021
    1. And ditching Unity for something that’s still not on par with it had already somewhat broken my trust in Ubuntu as a stable option at work. Now snap is coming closer and getting broader…
  5. Oct 2020
    1. Before you start a weight-loss program, it’s crucial to identify and create a treatment plan for any obesity-related illnesses or diseases.

      Find out more about medical weight loss here.

  6. Sep 2020
    1. Knowing this, if you want someone to make a decision they might consider risky (like abandoning an age-old software platform for something that works), it helps to talk about the bad things that will happen if they don’t take the risk. They’re more apt to respond to that than if you talk about the good things that will happen if they take the risk. In fact, talking about positive outcomes makes people more risk-averse (http://bkaprt.com/dcb/03-12/).
    2. loss aversion. We are way more scared of losing what we have than excited about getting something new.

  7. Aug 2020
    1. Altig, D., Baker, S. R., Barrero, J. M., Bloom, N., Bunn, P., Chen, S., Davis, S. J., Leather, J., Meyer, B. H., Mihaylov, E., Mizen, P., Parker, N. B., Renault, T., Smietanka, P., & Thwaites, G. (2020). Economic Uncertainty Before and During the COVID-19 Pandemic (Working Paper No. 27418; Working Paper Series). National Bureau of Economic Research. https://doi.org/10.3386/w27418

  8. Jul 2020
    1. "that text has been removed from the official version on the Apache site." This itself is also not good. If you post "official" records but then quietly edit them over time, I have no choice but to assume bad faith in all the records I'm shown by you. Why should I believe anything Apache board members claim was "minuted" but which in fact it turns out they might have just edited into their records days, weeks or years later? One of the things I particularly watch for in modern news media (where no physical artefact captures whatever "mistakes" are published as once happened with newspapers) is whether when they inevitably correct a mistake they _acknowledge_ that or they instead just silently change things.
  9. Jun 2020
    1. Barry, D., Buchanan, L., Cargill, C., Daniel, A., Delaquérière, A., Gamio, L., Gianordoli, G., Harris, R., Harvey, B., Haskins, J., Huang, J., Landon, S., Love, J., Maalouf, G., Matthews, A., Mohamed, F., Moity, S., Royal, D.-C., Ruby, M., & Weingart, E. (2020, May 27). Remembering the 100,000 Lives Lost to Coronavirus in America. The New York Times. https://www.nytimes.com/interactive/2020/05/24/us/us-coronavirus-deaths-100000.html

    1. but it launched with a plethora of issues that resulted in users rejecting it early on. Edge has since struggled to gain traction, thanks to its continued instability and lack of mindshare among users and web developers.
  10. May 2020
  11. Apr 2020
  12. Oct 2019
    1. the generator and discriminator losses derive from a single measure of distance between probability distributions. In both of these schemes, however, the generator can only affect one term in the distance measure: the term that reflects the distribution of the fake data. So during generator training we drop the other term, which reflects the distribution of the real data.

      GAN losses: how the two loss functions work during GAN training
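
      A minimal sketch of that asymmetry, assuming the usual minimax-style GAN losses written with binary cross-entropy (the models and data pipeline are assumed, not taken from the quoted text):

      ```python
      import torch
      import torch.nn.functional as F

      def discriminator_loss(d_real: torch.Tensor, d_fake: torch.Tensor) -> torch.Tensor:
          # Both terms of the distance measure: real samples pushed towards 1, fakes towards 0.
          real_term = F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real))
          fake_term = F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake))
          return real_term + fake_term

      def generator_loss(d_fake: torch.Tensor) -> torch.Tensor:
          # Only the fake-data term remains: the real-data term is constant with respect to
          # the generator's parameters, so it is dropped during generator training.
          return F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))
      ```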

  13. Apr 2019
    1. We all know health is wealth. Protein plays an important role in keeping our body healthy and fit. Whey protein isolate is a perfect source of protein and the other essential nutrients our body requires. To get your body into proper shape, elements like exercise, food, and water alone are not enough. Whey protein isolate powder contains less than 1% lactose, which helps people with lactose intolerance. Whey protein powder helps with weight loss and also helps shape your body.

  14. Feb 2019
    1. Deep Learning on Small Datasets without Pre-Training using Cosine Loss

      In contemporary deep learning, two things seem beyond dispute:

      1. Categorical cross-entropy loss after a softmax activation is the method of choice for classification;
      2. Training a CNN classifier from scratch on small datasets does not work well. In this paper the authors show that, when dealing with small per-class sample sizes, the cosine loss delivers better performance than cross-entropy (see the sketch after this note).
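
      A minimal sketch of a cosine loss of this kind, assuming L2-normalized network outputs compared against one-hot targets (names and shapes here are illustrative, not taken from the paper's code):

      ```python
      import torch
      import torch.nn.functional as F

      def cosine_loss(logits: torch.Tensor, targets: torch.Tensor, num_classes: int) -> torch.Tensor:
          # 1 - cosine similarity between the L2-normalized prediction and the one-hot target.
          one_hot = F.one_hot(targets, num_classes).float()    # (batch, num_classes), each row has unit norm
          pred = F.normalize(logits, p=2, dim=1)                # project predictions onto the unit sphere
          cos_sim = (pred * one_hot).sum(dim=1)                 # dot product with a one-hot vector = cosine similarity
          return (1.0 - cos_sim).mean()

      # Usage with any classifier head: logits of shape (batch, classes), integer labels.
      loss = cosine_loss(torch.randn(8, 10), torch.randint(0, 10, (8,)), num_classes=10)
      ```
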
    2. Towards a Deeper Understanding of Adversarial Losses

      Studies the various losses used in adversarial generative training, and also works out what makes one adversarial loss better than another.

  15. Jan 2019
    1. Training Neural Networks with Local Error Signals

      After GoogLeNet, the idea of local losses is hardly new anymore, is it~

    2. Eliminating all bad Local Minima from Loss Landscapes without even adding an Extra Unit

      What a bold paper: the whole thing is a single page, with only two references! One simple idea: introduce two auxiliary parameters so that every local minimum of the new loss is a global minimum of the original loss. Minor gripes:

      1. Not running a single experiment and leaving them all to the reader: is that really appropriate? [laugh-cry]

      2. The figures look great, mainly because they feel a lot like “Neural Ordinary Differential Equations”~

    3. Learning with Fenchel-Young Losses

      Another paper by Blondel on Fenchel-Young losses; I can't follow it, but it sounds impressive~

  16. Nov 2018
    1. Here is how you reach net profit on a P&L (Profit & Loss) account:
      Sales revenue = price (of product) × quantity sold
      Gross profit = sales revenue − cost of sales and other direct costs
      Operating profit = gross profit − overheads and other indirect costs
      EBIT (earnings before interest and taxes) = operating profit + non-operating income
      Pretax profit (EBT, earnings before taxes) = operating profit − one-off items and redundancy payments, staff restructuring − interest payable
      Net profit = pre-tax profit − tax
      Retained earnings = profit after tax − dividends

      $$Sales Revenue = (Price Of Product) \times (Quantity Sold)$$

      $$Gross Profit = (Sales Revenue) - (Cost)$$

      $$Operating Profit = (Gross Profit) - (Overhead)$$

      Earnings Before Interest and Taxes (EBIT): $$EBIT = (Operating Profit) + (Non-Operating Income)$$

      Earnings Before Taxes (EBT): $$EBT = (Operating Profit) - (One Off Items, Redundancy Payments, Staff Restructuring) - (Interest Payable)$$

      $$Net Profit = (EBT) - (Tax)$$

      $$ Retained Earnings = (Net Profit) - (Dividends)$$
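
      A tiny worked example of the same cascade, with made-up figures purely to show the arithmetic (the 20% tax rate and every other number below are assumptions, not taken from the text):

      ```python
      price, quantity = 25.0, 10_000

      sales_revenue     = price * quantity                      # 250,000
      gross_profit      = sales_revenue - 90_000                # minus cost of sales and other direct costs
      operating_profit  = gross_profit - 60_000                 # minus overheads and other indirect costs
      ebit              = operating_profit + 5_000              # plus non-operating income
      ebt               = operating_profit - 12_000 - 8_000     # minus one-off items, minus interest payable
      net_profit        = ebt - 0.20 * ebt                      # minus tax (20% assumed)
      retained_earnings = net_profit - 15_000                   # minus dividends
      ```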

    1. Smooth Loss Functions for Deep Top-k Classification

      It's actually quite creative~ By generalizing the multi-class SVM loss they further construct smoothness (infinite differentiability); the loss can reduce back to cross-entropy, the experiments show better robustness to noise, and along the way the paper discusses how to keep down the algorithmic complexity that the "smoothness" introduces.
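
      As a rough, generic illustration of that idea (my own construction, not necessarily the paper's exact top-k formulation): smoothing the Crammer–Singer multi-class SVM loss with a temperature-scaled log-sum-exp recovers the hard hinge loss as tau → 0 and reduces exactly to cross-entropy when tau = 1 and the margin is 0.

      ```python
      import torch

      def smooth_svm_loss(scores: torch.Tensor, target: int, delta: float = 1.0, tau: float = 1.0) -> torch.Tensor:
          # Add the margin delta to every class except the correct one, then take a
          # temperature-smoothed maximum (log-sum-exp) instead of a hard max.
          margins = scores + delta
          margins[target] = scores[target]
          return tau * torch.logsumexp(margins / tau, dim=0) - scores[target]

      scores = torch.tensor([13.0, -7.0, 11.0])
      # delta = 0, tau = 1 is exactly cross-entropy (negative log-softmax of the target class).
      print(smooth_svm_loss(scores, target=0, delta=0.0, tau=1.0))
      ```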

    1. For house staff in internal medicine, the introduction of hospitalists may mean a greater likelihood of being supervised by attending physicians who are highly skilled and experienced in providing inpatient care. House staff have long enjoyed a certain amount of autonomy, because many of their faculty supervisors have been relatively unfamiliar with modern inpatient care. Such autonomy may be diminished with the new approach to inpatient care. Although there is bound to be transitional pain, we believe that the potential for improved inpatient teaching will more than compensate for it. Moreover, this change will help answer public calls for closer and more effective faculty oversight of house staff and students.34
  17. May 2018
  18. Mar 2018
  19. Jul 2017
    1. Partial loss-of-function alleles cause the preferential loss of ventral structures and the expansion of remaining lateral and dorsal structures (Figure 1c) (Anderson and Nüsslein-Volhard, 1988). These loss-of-function mutations in spz produce the same phenotypes as maternal effect mutations in the 10 other genes of the dorsal group.

      This paper has been curated by Flybase.

  20. Feb 2017
    1. SVM only cares that the difference is at least 10

      The margin seems to be manually set by the creator in the loss function. In the sample code, the margin is 1, so the incorrect class has to be scored lower than the correct class by at least 1.

      How is this margin determined? It seems like one would have to know the magnitude of the scores beforehand.

      Diving deeper, is the scoring magnitude always the same if the parameters are normalized by their average and scaled to be between 0 and 1? (Or -1 and 1... not sure of the correct scaling implementation.)

      Coming back to the topic: is this 'minimum margin' or delta a tunable parameter? (The sketch after this note shows where delta enters the loss.)

      What effects do we see on the model by adjusting this parameter?

      What are best and worst case scenarios of playing with this parameter?
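
      A minimal NumPy sketch of the loss being asked about, with the margin exposed as a parameter `delta` (variable names are mine, and the example scores simply echo the quoted "at least 10" case); writing it out makes clear that delta is just an argument of the loss, on the same scale as the raw scores:

      ```python
      import numpy as np

      def multiclass_svm_loss(scores: np.ndarray, correct_class: int, delta: float = 1.0) -> float:
          # Sum over incorrect classes of max(0, s_j - s_correct + delta): each incorrect
          # class must score at least `delta` below the correct class to add zero loss.
          margins = np.maximum(0.0, scores - scores[correct_class] + delta)
          margins[correct_class] = 0.0      # the correct class never contributes
          return float(margins.sum())

      # With delta = 10, only the class scoring within 10 of the correct class adds loss.
      print(multiclass_svm_loss(np.array([13.0, -7.0, 11.0]), correct_class=0, delta=10.0))  # 8.0
      ```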

  21. Oct 2016
    1. This exercise is equivalent to a thousand crunches

      All women want a flat, toned abdomen. We know that to achieve this we have to do crunches, and since many of us don't like that idea, we set this option aside.

      So that this doesn't happen to you and you achieve the figure you desire, take note of this exercise that works better than crunches: the plank.

      You can also see: Exercises to reduce belly

      A study from Mayo Clinic shows that the effectiveness of this exercise lies in holding pressure on the abdominal muscles for a few seconds at a time; done at a constant frequency, this can amount to doing a thousand crunches.

      Dare to do it

      Place your hands firmly on the ground, keep your shoulders and neck straight, and concentrate the pressure on the muscles of your abdomen. Read more info about Fat Diminisher

      Try to keep your buttocks straight and breathe quietly; that will help you stay longer in this position.

      For this exercise to effectively work your abdominal area, you must keep your body in this position as long as possible.

    1. who was once handsome and tall as you

      Obviously on some level a warning about our helplessness in the face of death, but also reminds me of Marie talking about her childhood feeling free in the mountains. She was "free" and Phlebas was "handsome and tall," but the trajectory seems to point down for everyone in more ways than physical as they approach death (whether by old age or not).

    2. In the mountains, there you feel free. I read, much of the night, and go south in the winter.

      After all this time talking about the comforts of winter, Marie says she goes south in the winter now -- presumably also staying away from the mountains that made her feel free as a child.

    3. The river’s tent is broken: the last fingers of leaf Clutch and sink into the wet bank.

      The "magic" is now gone from a "magical" place that once inspired poets to write about love and beauty- now it's empty and becoming polluted.

  22. Jan 2016
    1. I realize that it doesn’t seem as significant as the fantastic imagery of the imagination which you have come to believe is reality, and which it now seems to you that you are ignoring and therefore separating yourself from. But as has been said, “the way is straight and narrow, and few there be that go in thereat.” 2 And this is because it feels like the loss of immense, all-inclusive, total, images, concepts and belief structures called reality, but which necessarily has a small “r.”

      It feels like a loss, yet this is in 3d.

  23. Feb 2014