• Newly leaked internal documents reviewed by TechCrunch reveal that Microsoft has already received $865.9 million from OpenAI through a 20% revenue-share deal covering the first three quarters of 2025.

    The payout implies revenue well below the multibillion-dollar figures OpenAI has often cited publicly.

    The documents also show that OpenAI’s inference costs reached an estimated $12.4 billion from early 2024 through Q3 2025, underscoring how expensive it is to run large AI models at scale.

    These costs are significantly higher than what many analysts previously assumed.
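
    As a back-of-the-envelope check, here is a minimal sketch in Python, assuming the 20% share applies to top-line revenue and that the leaked figures are accurate; the variable names are illustrative only.

    ```python
    # Rough sanity check of the leaked figures; illustrative assumptions only.
    revenue_share_rate = 0.20        # reported 20% revenue-share rate
    payout_to_microsoft = 865.9e6    # USD, first three quarters of 2025
    inference_cost = 12.4e9          # USD, early 2024 through Q3 2025 (a longer window)

    # If the payout really is 20% of revenue, Q1-Q3 2025 revenue would be roughly:
    implied_revenue = payout_to_microsoft / revenue_share_rate
    print(f"Implied Q1-Q3 2025 revenue: ${implied_revenue / 1e9:.2f}B")            # ~$4.33B

    # The two periods differ, so this is only a coarse comparison of scale.
    print(f"Reported inference spend:   ${inference_cost / 1e9:.1f}B")             # $12.4B
    print(f"Spend vs. implied revenue:  {inference_cost / implied_revenue:.1f}x")  # ~2.9x
    ```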

    The leak raises questions about the sustainability of OpenAI’s business model and whether its rapid growth is being fueled more by massive compute spending than actual profitability.

    It also highlights how dependent OpenAI remains on Microsoft’s infrastructure.

    If accurate, the numbers suggest that OpenAI’s financial reality is much more fragile than its public image.

    Investors and partners may now push for transparency around true revenue, margins, and long-term viability.

    Follow us (@artificialintelligenceee) for all the latest from the AI world.

    Source: TechCrunch
  • Microsoft and Nvidia have teamed up to invest in Anthropic, marking one of the biggest alliances in the AI industry this year.

    As part of the deal, Anthropic will spend $30 billion on Microsoft’s Azure cloud, tightening the competition around frontier AI models.

    Nvidia is expected to invest up to $10 billion, while Microsoft will put in up to $5 billion. Both companies are joining Anthropic’s next funding round, a move analysts say is designed to reduce their dependence on OpenAI.

    Satya Nadella said Microsoft will continue working with OpenAI, but the company does not want to rely on a single frontier model provider.

    The partnership also deepens Nvidia’s position in the AI ecosystem. The chipmaker will work with Anthropic on next-generation hardware, including efforts tied to its Grace Blackwell and Vera Rubin platforms.

    Anthropic has committed up to 1 gigawatt of compute using Nvidia systems. Industry executives estimate this setup could cost more than $20 billion.

    Anthropic is now one of the fastest-growing AI companies, valued at around $183 billion and powered by strong enterprise demand.

    The company expects its revenue run rate to climb to nearly $26 billion next year, supported by more than 300,000 business customers.

    Follow us (@artificialintelligenceee) for all the latest from the AI world.

    Source: CNBC
  • Polish has unexpectedly emerged as a powerhouse language for artificial intelligence. A new benchmark developed by researchers at the University of Maryland, Microsoft, and UMass Amherst shows that when AI models are pushed into long-context tasks, Polish outperforms 25 other languages, including English and Chinese. The test, called ONERULER, evaluated how well major systems from OpenAI, Google, Meta, Qwen, and DeepSeek could retrieve and synthesize information across documents stretching up to 128,000 tokens.⁠

    The results flip long-held assumptions about linguistic dominance in machine learning. English and Chinese may saturate global training data, but abundant data does not guarantee deeper comprehension. Under heavy context loads, models handled Polish with an average accuracy of 88 percent, while English placed sixth and Chinese landed near the bottom. Slavic and Romance languages consistently scored well, hinting that inflected grammar, Latin or Cyrillic scripts, and more regular syntactic patterns may help models track meaning across long passages.

    That advantage becomes even clearer in demanding “needle-in-a-haystack” tasks, where systems must surface a single buried detail from a book-length text. Polish not only held its lead but widened it, suggesting that the structure of a language can shape how effectively a model encodes relationships within sprawling inputs. Meanwhile, low-resource languages such as Swahili and Sesotho struggled, and Chinese models showed particular difficulty, revealing how tokenization and writing systems influence model behavior.⁠
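
    To make the “needle-in-a-haystack” setup concrete, here is a minimal sketch of how such a test is typically constructed; it is not the ONERULER harness, and ask_model() is a hypothetical stand-in for whichever long-context model is being evaluated.

    ```python
    import random

    def build_haystack(filler_sentences, needle, total_sentences=5000, seed=0):
        """Bury a single 'needle' fact at a random position inside long filler text."""
        rng = random.Random(seed)
        sentences = [rng.choice(filler_sentences) for _ in range(total_sentences)]
        position = rng.randrange(len(sentences))
        sentences.insert(position, needle)
        return " ".join(sentences), position

    filler = [
        "The committee adjourned without reaching a decision.",
        "Rainfall this month was slightly above the seasonal average.",
    ]
    needle = "The access code mentioned in the briefing is 7421."
    haystack, position = build_haystack(filler, needle)

    prompt = haystack + "\n\nQuestion: What access code is mentioned in the briefing?"
    # answer = ask_model(prompt)    # hypothetical call to the model under test
    # correct = "7421" in answer    # scored over many positions, lengths, and languages
    ```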

    The findings arrive at a moment when Poland is investing heavily in national AI efforts, including its own large language model, PLLuM. The study underscores a broader lesson for the field: multilingual diversity is not just a cultural goal, it is a technical one, and languages with smaller global footprints may hold surprising advantages for the next generation of AI.⁠

    Source: 10.48550/arXiv.2503.01996
  • The Biggest Gains Come After Year 10

    Everyone wants to double their money now.
    But real wealth in stocks doesn’t work that way.

    Amazon, Google, Tesla… all looked like “just okay” companies in their first 10 years.
    The real magic—96% of their total value—came after that.

    Why?
    Because compounding takes time.
    The early years are quiet.
    The late years are explosive.
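
    A quick illustration, assuming (purely for the arithmetic) a steady 25% annual growth rate held for 20 years:

    ```python
    # Illustrative only: a constant 25% annual growth rate, held for 20 years.
    growth, years = 1.25, 20

    values = [growth ** t for t in range(years + 1)]   # value of 1 unit after t years
    total_gain = values[-1] - values[0]
    gain_after_year_10 = values[-1] - values[10]

    print(f"Final value: {values[-1]:.0f}x the starting stake")                  # ~87x
    print(f"Gain earned after year 10: {gain_after_year_10 / total_gain:.0%}")   # ~90%
    ```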

    But most investors don’t wait long enough.
    They sell after 2 years of “meh” returns…
    Right before the rocket takes off.

    So what’s the lesson?
    If you’ve done the work and picked a great business—give it time.
    Holding through the boring years is what separates average investors from wealthy ones.

    In long-term investing, patience isn’t just a virtue—it’s a weapon.

    Follow @masteringwealth & @goodstudent_investing for the best investing content on Instagram

    Source & credits: NFX, 2022

    Note: Post includes opinions, not investment advice.
    #investing101 #investingstrategy #stockmarkets #msft #aapl #applestock #dividends #tsla #teslastock #dividendgrowthstocks #microsoft #billgates #dividendinvesting #investingeducation #stockstowatch #stockstobuy #stockstohold #stockmarketnews #stockmarket #hustle #nyse #nasdaq #investing101 #stocks #stockstotrade #intelligentinvesting #elonmusk #stevejobs
  • Xbox CEO Phil Spencer congratulated Valve after the reveal of its new Steam Machine console, emphasizing that gaming progresses when players and developers have more ways to play and create. In his post on X, Spencer said expanding access across PC, console, and handhelds reflects a future built on choice — values Xbox has always supported. He also mentioned Xbox is one of the largest publishers on Steam, where titles like Minecraft and Call of Duty perform strongly. With Microsoft preparing its next console and leaning toward fewer exclusives, many wonder if Xbox could take a similar approach. Do you think Xbox will follow Valve’s lead?

    #PhilSpencer #SteamMachine

    [Follow @gamenewsplusnet]

    Hashtags:

    #Gaming #VideoGames #Game #Gamer #GameNewsPlus
  • Reuters reports that OpenAI is preparing for an IPO that could value the company at as much as $1 trillion.

    The company recently transitioned into a Public Benefit Corporation under its nonprofit foundation.

    A secondary share sale in October 2025 valued OpenAI at $500 billion.

    Discussions point to a possible IPO between late 2026 and 2027, with the offering expected to raise at least $60 billion to support compute and data center expansion.

    After its recent restructuring, Microsoft owns roughly 27%, while the nonprofit foundation still holds a significant share.
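
    Purely to illustrate the scale, assuming the roughly 27% figure holds and ignoring any dilution from new shares issued in an IPO:

    ```python
    # What a ~27% stake would be worth at the quoted valuations (illustrative only).
    microsoft_stake = 0.27

    for label, valuation in [("Oct 2025 secondary-sale valuation", 500e9),
                             ("Reported potential IPO valuation", 1000e9)]:
        print(f"{label}: ~${microsoft_stake * valuation / 1e9:.0f}B")
    # ~$135B and ~$270B respectively
    ```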

    If realized, this move could redefine AI investment and public access to major AI firms.

    #ai #artificialintelligence #aitools #aihacks #chatgpt #tech #technology
  • Microsoft’s AI CEO Mustafa Suleyman introduced the MAI Superintelligence Team, a new division focused on creating powerful AI systems that address global issues like medicine, clean energy, and education instead of chasing open-ended AGI.

    The initiative reflects Suleyman’s idea of “Humanist Superintelligence,” meaning AI designed to serve humanity rather than outpace it. The goal is to build technology that uplifts people, improves health, and powers sustainable solutions for the planet.

    This team will prioritize practical breakthroughs such as medical superintelligence, AI learning companions, and advanced energy systems that can reshape essential sectors of society.

    With this step, Microsoft takes a clear stance in the AI race, focusing on purpose, safety, and human progress over limitless intelligence.

    Follow us (@artificialintelligenceee) for all the latest from the AI world.

    Source: https://microsoft.ai/news/towards-humanist-superintelligence/
  • Sam Altman’s comments about Slack creating endless “fake work” have stirred the tech world; he argued for tools that actually get things done instead of just keeping people busy.

    He explained that future workplaces will depend on AI-native productivity systems capable of managing tasks, emails, documents, and meetings without constant human micromanagement, making collaboration smoother and more intelligent.

    Soon after his remarks, Elon Musk reacted online, suggesting that OpenAI’s move toward such tools puts it in direct competition with Microsoft, which already dominates office software with its own AI integrations.

    Musk later went further, warning that continuing to support OpenAI could be “insanely suic*dal” for Microsoft, hinting that their once-powerful alliance might now be heading toward open competition.

    Follow us (@artificialintelligenceee) for all the latest from the AI world.
  • First, a fridge without the fumes. Scientists at Lawrence Berkeley National Laboratory have built a new “ionocaloric” cycle that cools by moving ions through a material to shift its melting point, the same physics behind road salt melting ice. In lab tests, a sodium iodide salt in ethylene carbonate delivered a 25 °C temperature swing using under one volt, a bigger lift than most solid-state “caloric” approaches and without hydrofluorocarbon refrigerants.

    Because it toggles a solid–liquid phase change, the working fluid can be pumped, avoiding compressors and complex valves. The team’s models suggest efficiency on par with, or better than, today’s vapor-compression systems. Using ethylene carbonate, which can be synthesized from captured CO₂, the refrigerant footprint could be not just low but potentially carbon-negative. If prototypes scale, the same cycle could also supply efficient water and process heating, trimming emissions from buildings and industry that are notoriously hard to decarbonize.⁠
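
    For intuition, the “road salt” effect is ordinary freezing-point depression. The sketch below uses the familiar water/NaCl numbers; the Berkeley Lab device instead uses sodium iodide in ethylene carbonate, whose constants differ, and drives the ion concentration electrically rather than by simply dissolving salt.

    ```python
    # Freezing-point depression, the colligative effect the ionocaloric cycle exploits.
    # Constants below are for the classic road-salt case (NaCl in water), not for the
    # sodium iodide / ethylene carbonate system described above.
    K_f_water = 1.86     # cryoscopic constant of water, deg C * kg / mol
    vant_hoff_i = 2      # NaCl dissociates into roughly 2 ions per formula unit
    molality = 1.0       # mol of dissolved salt per kg of solvent

    delta_T = vant_hoff_i * K_f_water * molality
    print(f"Melting point lowered by about {delta_T:.1f} deg C")   # ~3.7 deg C
    ```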

    Now, a data center that chills with the sea. Off Shanghai, Hailanyun’s first commercial underwater AI facility places sealed server pods beneath offshore wind turbines and circulates seawater across radiators to carry heat away. Internal assessments with a Chinese institute report at least 30% lower electricity use for cooling compared with land sites, and the company says the farm is powered 97% by the nearby wind array.⁠

    One operational pod holds 198 racks, enough for roughly 396–792 AI-ready servers, and the company claims capacity to train a GPT-3.5-class model in a day. Microsoft’s earlier Project Natick found submerged servers can fail less often, but scaling raises new risks, including thermal plumes, acoustic sabotage, corrosion, biofouling, and slow maintenance cycles. From ions to oceans, cooling is being rewired for an AI-hungry, climate-strained future.⁠

    #tech #ai #cooling #climate #datacenters #materials #energy #sustainability #berkeleylab

    Source: 10.1126/science.ade1696
  • China is taking data centers where the cooling is free, underwater. Off Hainan and Shanghai, the country has begun submerging modular “cabin-pods” that house racks of servers beneath 35 meters of seawater, using the ocean’s steady chill as a giant heatsink. A new 24-megawatt prototype near Shanghai taps nearby offshore wind for power, while Hainan’s commercial site marks the first large-scale deployment of this radical idea. The move is part of China’s broader “blue economy” strategy, merging digital infrastructure with maritime innovation.⁠

    Why go to sea? Cooling devours roughly 40% of a data center’s electricity. With data centers already consuming 2–3% of global power and AI demand projected to push that figure 165% higher by 2030, every watt saved matters. In these pods, pumps route cold seawater across radiators behind server racks, slashing the energy and freshwater normally spent on chillers and evaporative towers. Engineers estimate cooling costs can fall by up to 90%, while pairing with offshore wind farms brings total renewable usage to as high as 95%. It’s a compact, self-sustaining ecosystem powered by tides and air currents.⁠
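
    A rough sketch of what those percentages would mean for a whole facility, assuming cooling really is about 40% of load and treating the quoted cost reduction as an energy reduction:

    ```python
    # Illustrative estimate built only from the figures quoted above.
    cooling_share = 0.40     # fraction of data-center electricity spent on cooling
    cooling_saving = 0.90    # claimed reduction in cooling cost, treated here as energy

    total_saving = cooling_share * cooling_saving
    print(f"Approximate facility-level energy saving: {total_saving:.0%}")   # ~36%

    facility_mw = 24         # the Shanghai prototype's reported capacity
    print(f"Roughly {facility_mw * total_saving:.1f} MW of avoided draw")    # ~8.6 MW
    ```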

    Reliability is engineered in. Each 1,400-ton pod is sealed and filled with inert nitrogen to halt corrosion, dust, and humidity, a design refined from Microsoft’s Project Natick experiments off Scotland. If something fails, China’s approach is “swap, don’t fix,” haul the module up, replace it onshore, and redeploy within days.⁠

    The plan scales by multiplication. Hainan targets a full subsea network of 100 cabins, and Shanghai’s 24-megawatt unit is a pathfinder for 500-megawatt clusters in the coming decade. It’s an audacious bet that taming AI’s growing heat and water hunger may be easier on the seafloor than on land. If it works, the future architecture of the digital age may not be green at all, but deep blue.