HackerNews Digest Daily

November 22, 2023

Hacker News Top Stories with Summaries (November 22, 2023)

Here are the top stories from Hacker News with summaries for November 22, 2023:

We have reached an agreement in principle for Sam to return to OpenAI as CEO

https://twitter.com/openai/status/1727206187077370115

Summary: JavaScript is disabled in this browser; enable it or switch to a supported browser to continue using twitter.com. A list of supported browsers is available in the Help Center. Terms of Service, Privacy Policy, Cookie Policy, Imprint, Ads info. © 2023 X Corp. Retry if an error occurs.


UltraFastBERT

https://arxiv.org/abs/2311.10770

Summary: Researchers Peter Belcak and Roger Wattenhofer have developed UltraFastBERT, a BERT variant that uses only 0.3% of its neurons during inference while matching the performance of comparable BERT models. By replacing the feedforward networks with fast feedforward networks (FFFs), UltraFastBERT selectively engages just 12 out of 4095 neurons for each layer inference. The authors' high-level CPU code achieves a 78x speedup over the optimized baseline feedforward implementation, and their PyTorch implementation a 40x speedup over equivalent batched feedforward inference.
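For context on how a fast feedforward layer can touch so few neurons, here is a minimal sketch, assuming the neurons of each layer are arranged as a full binary tree of depth 11 (4095 nodes) and each token's inference walks a single root-to-leaf path of 12 nodes. The class name, weight shapes, and per-token loop are illustrative assumptions for clarity, not the authors' optimized implementation.

    import torch
    import torch.nn as nn

    class FastFeedForwardSketch(nn.Module):
        """Illustrative fast-feedforward (FFF) layer: neurons form a binary
        tree, and each token evaluates only one root-to-leaf path.
        All names and shapes here are assumptions for illustration."""

        def __init__(self, hidden_dim: int = 768, depth: int = 11):
            super().__init__()
            self.depth = depth
            n_nodes = 2 ** (depth + 1) - 1            # 4095 neurons at depth 11
            # One input and one output weight vector per tree node (neuron).
            self.w_in = nn.Parameter(torch.randn(n_nodes, hidden_dim) * 0.02)
            self.w_out = nn.Parameter(torch.randn(n_nodes, hidden_dim) * 0.02)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, hidden_dim); looped per token for clarity, not speed.
            out = torch.zeros_like(x)
            for b in range(x.shape[0]):
                node = 0                              # start at the root neuron
                for _ in range(self.depth + 1):       # visit 12 nodes on the path
                    act = torch.dot(self.w_in[node], x[b])
                    out[b] += torch.nn.functional.gelu(act) * self.w_out[node]
                    # The sign of the pre-activation picks the child to descend to.
                    node = 2 * node + (1 if act > 0 else 2)
            return out

    # Minimal usage: each of the 4 tokens touches only 12 of the 4095 neurons.
    layer = FastFeedForwardSketch()
    with torch.no_grad():
        y = layer(torch.randn(4, 768))

Because the branch taken depends on each neuron's own pre-activation, the path is conditional on the input, which is how a layer can evaluate roughly 0.3% of its neurons per token.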
