12 Matching Annotations
  1. May 2021
  2. Mar 2021
  3. Jun 2020
    1. For large caches, since the working sets fit in the cache before and after inlining, the effect of inlining is insignificant

      An increase in code size may also affect ITLB misses: the static code object now spans more pages, so code-size-expanding optimizations can also cause ITLB misses. The ITLB caches a portion of the page table (the virtual-to-physical mappings). A sketch of how this could be observed follows below.
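
      Not from the original note, but a minimal C sketch of how the effect could be checked, assuming gcc/clang attribute support and a Linux machine where perf exposes iTLB events; the macro and function names here are made up for illustration. Build both variants and compare them with `perf stat -e iTLB-loads,iTLB-load-misses ./a.out`.

      ```c
      /* Toy example: toggling inlining changes how much code the hot loop spans.
       * Build: gcc -O2 -DFORCE_INLINE itlb_demo.c   (inlined variant)
       *        gcc -O2 itlb_demo.c                  (out-of-line variant)
       */
      #include <stdio.h>

      #ifdef FORCE_INLINE
      #  define MAYBE_INLINE static inline __attribute__((always_inline))
      #else
      #  define MAYBE_INLINE static __attribute__((noinline))
      #endif

      /* Helper whose body is duplicated into every call site when inlined. */
      MAYBE_INLINE long scale(long x) { return x * 3 + 1; }

      int main(void) {
          long acc = 0;
          for (long i = 0; i < 100000000L; i++)
              acc += scale(i);   /* hot call site; inlining enlarges main()'s code */
          printf("%ld\n", acc);
          return 0;
      }
      ```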

  4. Mar 2020
    1. Binding a specific physical address range to a memory controller, and constraining all memory requests from a given set of CPU cores to be directed to that controller, ensures that on a cache miss the memory request reaches the home agent of that specific memory controller. This provides a low-latency path and is the core concept behind NUMA.

    2. This, together with multi-ported memory, can facilitate concurrent, non-conflicting memory accesses, since each memory controller only serves a specific set of memory banks (its address range); a sketch using libnuma follows below.
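
      A minimal sketch of acting on this on Linux with libnuma (not from the annotation; node 0 and the buffer size are arbitrary assumptions): pin the calling thread to one node and allocate memory backed by that node's controller, so that cache misses are served by the local home agent. Link with `-lnuma`.

      ```c
      /* Sketch: keep a thread's memory traffic on its local memory controller.
       * Assumes a Linux system with libnuma installed; build with -lnuma.
       */
      #include <numa.h>
      #include <stdio.h>
      #include <string.h>

      int main(void) {
          if (numa_available() < 0) {                 /* kernel/libnuma support check */
              fprintf(stderr, "NUMA not available on this system\n");
              return 1;
          }

          int node = 0;                               /* assumed local node for this demo */
          numa_run_on_node(node);                     /* constrain this thread to node 0's CPUs */

          size_t len = 64UL * 1024 * 1024;
          char *buf = numa_alloc_onnode(len, node);   /* pages placed on node 0's memory */
          if (!buf) {
              fprintf(stderr, "allocation failed\n");
              return 1;
          }

          memset(buf, 0xAB, len);                     /* misses are served by the local controller */
          numa_free(buf, len);
          return 0;
      }
      ```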

  5. Feb 2020