Latest Tech News

Stay updated with the latest in technology, AI, cybersecurity, and more

Nano-engineered thermoelectrics enable scalable, compressor-free cooling

Researchers at the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland, have developed a new, easily manufacturable solid-state thermoelectric refrigeration technology with nano-engineered materials that is twice as efficient as devices made with commercially available bulk thermoelectric materials. As global demand grows for more energy-efficient, reliable and compact cooling solutions, this advancement offers a scalable alternative to traditional compressor-based refrigeration.
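For context, a thermoelectric cooler's efficiency is set by the material's dimensionless figure of merit ZT. The formulas below are standard thermoelectric theory (not taken from the APL announcement) and show why engineering the material, chiefly by suppressing its thermal conductivity, raises the achievable coefficient of performance.

```latex
% Figure of merit of a thermoelectric material at temperature T:
%   S      - Seebeck coefficient
%   \sigma - electrical conductivity
%   \kappa - thermal conductivity (nanostructuring scatters phonons, lowering it)
ZT = \frac{S^2 \sigma}{\kappa}\, T

% Maximum coefficient of performance of a Peltier cooler working between
% cold side T_c and hot side T_h, with \bar{T} = (T_h + T_c)/2:
\mathrm{COP}_{\max} = \frac{T_c}{T_h - T_c} \cdot
  \frac{\sqrt{1 + Z\bar{T}} - T_h/T_c}{\sqrt{1 + Z\bar{T}} + 1}
```

Since COP_max increases monotonically with ZT, improving the material's figure of merit is the lever behind the reported doubling of device efficiency.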

How large are large language models? (2025)

This aims to be factual information about the sizes of large language models. None of this document was written by AI, and I do not include any information from leaks or rumors. The focus is on base models (the raw text-continuation engines, not 'helpful chatbot/assistants'). This is a view, from a few years ago to today, of one very tiny fraction of the larger LLM story. History: GPT-2, -medium, -large, -xl (2019): 137M, 380M, 812M, 1.61B.
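As a rough cross-check on numbers like these, the parameter count of a GPT-style decoder-only transformer follows a standard back-of-envelope formula; this is general transformer arithmetic, not something from the article.

```latex
% L = number of layers, d = model width, V = vocabulary size.
% Each transformer layer holds roughly 4d^2 attention weights and 8d^2 MLP weights:
N \approx 12\,L\,d^2 + V d

% GPT-2 small (L = 12, d = 768, V = 50257):
N \approx 12 \cdot 12 \cdot 768^2 + 50257 \cdot 768 \approx 1.24 \times 10^8
```

The ~124M result versus the 137M quoted above is typical: differences between published figures usually come down to counting conventions, i.e., exactly which embedding and auxiliary weights are included in the total.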


Cosmoe: BeOS Class Library on Top of Wayland

The current iteration of Cosmoe is a shared library that implements the BeOS class library on top of Wayland. No supporting programs (e.g., app_server or registrar) are needed to use it; all the necessary functionality is rolled into the library, and apps linked against it run natively on Linux via Wayland. The previous iteration of Cosmoe (now known as "Cosmoe Classic") is a full port of the Haiku OS to the Linux kernel, running inside an SDL window on Linux.
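To make that concrete, here is a minimal sketch of the kind of BeOS-style program that would link against such a library. It uses the standard BeOS/Haiku Application Kit classes (BApplication, BWindow); the window title and MIME signature are illustrative, not taken from the Cosmoe project.

```cpp
// Minimal BeOS/Haiku-style app: one BApplication driving one BWindow.
// Under Cosmoe this would link against the Cosmoe shared library instead
// of Haiku's libbe, with the window backed by a Wayland surface.
#include <Application.h>
#include <Window.h>
#include <Rect.h>

class HelloWindow : public BWindow {
public:
    HelloWindow()
        : BWindow(BRect(100, 100, 500, 400), "Hello from Cosmoe",
                  B_TITLED_WINDOW, B_QUIT_ON_WINDOW_CLOSE)
    {
    }
};

class HelloApp : public BApplication {
public:
    // The MIME-type signature identifies the app; this one is made up.
    HelloApp() : BApplication("application/x-vnd.example-hello") {}

    void ReadyToRun() override
    {
        (new HelloWindow())->Show();  // windows delete themselves on close
    }
};

int main()
{
    HelloApp app;
    app.Run();  // enters the BApplication message loop until quit
    return 0;
}
```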

Archaeologists Unearth Viking-Era Burial With Incredibly Rare Casket

Archaeologists from Denmark’s Moesgaard Museum have uncovered 30 Viking Age graves dating from 800 to 1050 CE, just under five miles north of Aarhus. Located near the town of Lisbjerg, the burial site has yielded a number of spectacular objects hinting at ties with Danish royalty. “The burial site is most likely connected to the Viking-era manor in Lisbjerg, which is less than a kilometer from the burial site,” explained Mads Ravn, an archaeologist at Moesgaard and a Viking Age expert.