Ruby

Is Ruby Still a 'Serious' Programming Language? (wired.com) 80

Wired published an article by California-based writer/programmer Sheon Han arguing that Ruby "is not a serious programming language."

Han believes that the world of programming has "moved on", and "everything Ruby does, another language now does better, leaving it without a distinct niche. Ruby is easy on the eyes. Its syntax is simple, free of semicolons or brackets. More so even than Python — a language known for its readability — Ruby reads almost like plain English... Ruby, you might've guessed, is dynamically typed. Python and JavaScript are too, but over the years, those communities have developed sophisticated tools to make them behave more responsibly. None of Ruby's current solutions are on par with those. It's far too conducive to what programmers call "footguns," features that make it all too easy to shoot yourself in the foot.

Critically, Ruby's performance profile consistently ranks near the bottom (read: slowest) among major languages. You may remember Twitter's infamous "fail whale," the error screen with a whale lifted by birds that appeared whenever the service went down. You could say that Ruby was largely to blame. Twitter's collapse during the 2010 World Cup served as a wake-up call, and the company resolved to migrate its backend to Scala, a more robust language.

The move paid off: By the 2014 World Cup, Twitter handled a record 32 million tweets during the final match without an outage. Its new Scala-based backend could process up to 100 times faster than Ruby. In the 2010s, a wave of companies replaced much of their Ruby infrastructure, and when legacy Ruby code remained, new services were written in higher-performance languages.

You may wonder why people are still using Ruby in 2025. It survives because of its parasitic relationship with Ruby on Rails, the web framework that enabled Ruby's widespread adoption and continues to anchor its relevance.... Rails was the framework of choice for a new generation of startups. The main code bases of Airbnb, GitHub, Twitter, Shopify, and Stripe were built on it.

He points out that on Stack Overflow's annual developer survey, Ruby has slipped from a top-10 technology in 2013 to #18 this year — "behind even Assembly" — calling Ruby "a kind of professional comfort object, sustained by the inertia of legacy code bases and the loyalty of those who first imprinted upon it." But the article drew some criticism on X.com. ("You should do your next piece about how Vim isn't a serious editor and continue building your career around nerd sniping developers.")

Other reactions...
  • "Maybe WIRED is just not a serious medium..."
  • "FWIW — Ruby powered Shopify through another Black Friday / Cyber Monday — breaking last year's record."
  • "Maybe you should have taken a look at TypeScript..."

Wired's subheading argues that Ruby "survives on affection, not utility. Let's move on." Are they right? Share your own thoughts and experiences in the comments.



Programming

Could C# Overtake Java in TIOBE's Programming Language Popularity Rankings? (techrepublic.com) 100

TIOBE has been trying to measure the popularity of programming languages since 2000 using metrics like the number of engineers, courses, and third-party vendors. And "The November 2025 TIOBE Index brings another twist below Python's familiar lead," writes TechRepublic. "C solidifies its position as runner-up, C++ and Java lose some ground, and C# moves sharply upward, narrowing the gap with Java to less than a percentage point..."

TIOBE CEO Paul Jansen said this month that "instead of Python, programming language C# is now the fastest rising language." How did C# achieve this? Java and C# have been battling for a long time in the same areas. Right now it seems like C# has removed every reason not to use C# instead of Java: it is cross-platform nowadays, it is open source, and it contains all the new language features a developer wants. While the financial world is still dominated by Java, all other terrains show equal shares between Java and C#. Besides this, Microsoft is going strong, and C# is still its most backed programming language.

Interesting note: C# has never been higher than Java in the TIOBE index. Currently the difference between the two rivals is less than 1%. There are exciting times ahead of us. Is C# going to surpass Java for the first time in the TIOBE index history?

"The fact that C# has been in the news for the successive betas and pre-release candidates prior to the release of C# 14 may have bumped up its percentage share in the last few months," notes a post on the site i-Programmer. But they also point out that by TIOBE's reckoning, Java — having been overtaken by Python in 2021 — "has been in decline ever since."

TechRepublic summarizes the rest of the Top Ten: JavaScript stays in sixth place at 3.42%, and Visual Basic edges up to seventh with 3.31%. Delphi/Object Pascal nudges upward to eighth at 2.06%, while Perl returns to the top 10 in ninth at 1.84% after a sharp year-over-year climb. SQL rounds out the list at tenth with 1.80%, maintaining a foothold that shows the enduring centrality of relational databases. Go, which held eighth place in October, slips out of the top 10 entirely.
Here's how TIOBE's methodology ranks programming language popularity in November:
  1. Python
  2. C
  3. C++
  4. Java
  5. C#
  6. JavaScript
  7. Visual Basic
  8. Delphi/Object Pascal
  9. Perl
  10. SQL

Python

Python Foundation Donations Surge After Rejecting Grant - But Sponsorships Still Needed (blogspot.com) 64

After the Python Software Foundation rejected a $1.5 million grant because it restricted DEI activity, "a flood of new donations followed," according to a new report. By Friday they'd raised over $157,000, including 295 new Supporting Members paying an annual $99 membership fee, says PSF executive director Deb Nicholson.

"It doesn't quite bridge the gap of $1.5 million, but it's incredibly impactful for us, both financially and in terms of feeling this strong groundswell of support from the community." Could that same security project still happen if new funding materializes? The PSF hasn't entirely given up. "The PSF is always looking for new opportunities to fund work benefiting the Python community," Nicholson told me in an email last week, adding pointedly that "we have received some helpful suggestions in response to our announcement that we will be pursuing." And even as things stand, the PSF sees itself as "always developing or implementing the latest technologies for protecting PyPI project maintainers and users from current threats," and it plans to continue with that commitment.
The Python Software Foundation was "astounded and deeply appreciative at the outpouring of solidarity in both words and actions," their executive director wrote in a new blog post this week, saying the show of support "reminds us of the community's strength."

But that post also acknowledges the reality that the Python Software Foundation's yearly revenue and assets (including contributions from major donors) "have declined, and costs have increased..." Historically, PyCon US has been a source of revenue for the PSF, enabling us to fund programs like our currently paused Grants Program... Unfortunately, PyCon US has run at a loss for three years — and not from a lack of effort from our staff and volunteers! Everyone has been working very hard to find areas where we can trim costs, but even with those efforts, inflation continues to surge, and changing U.S. and economic conditions have reduced our attendance... Because we have so few expense categories (the vast majority of our spending goes to running PyCon US, the Grants Program, and our small 13-member staff), we have limited "levers to pull" when it comes to budgeting and long-term sustainability...
While Python usage continues to surge, "corporate investment back into the language and the community has declined overall. The PSF has longstanding sponsors and partners that we are ever grateful for, but signing on new corporate sponsors has slowed." (They're asking employees at Python-using companies to encourage sponsorships.) We have been seeking out alternate revenue channels to diversify our income, with some success and some challenges. PyPI Organizations offers paid features to companies (PyPI features are always free to community groups) and has begun bringing in monthly income. We've also been seeking out grant opportunities where we find good fits with our mission.... We currently have more than six months of runway (as opposed to our preferred 12 months+ of runway), so the PSF is not at immediate risk of having to make more dramatic changes, but we are on track to face difficult decisions if the situation doesn't shift in the next year.

Based on all of this, the PSF has been making changes and working on multiple fronts to combat losses and work to ensure financial sustainability, in order to continue protecting and serving the community in the long term. Some of these changes and efforts include:

— Pursuing new sponsors, specifically in the AI industry and the security sector
— Increasing sponsorship package pricing to match inflation
— Making adjustments to reduce PyCon US expenses
— Pursuing funding opportunities in the US and Europe
— Working with other organizations to raise awareness
— Strategic planning, to ensure we are maximizing our impact for the community while cultivating mission-aligned revenue channels

The PSF's end-of-year fundraiser effort is usually run by staff based on their capacity, but this year we have assembled a fundraising team that includes Board members to put some more "oomph" behind the campaign. We'll be doing our regular fundraising activities; we'll also be creating a unique webpage, piloting temporary and VERY visible pop-ups to python.org and PyPI.org, and telling more stories from our Grants Program recipients...

Keep your eyes on the PSF Blog, the PSF category on Discuss, and our social media accounts for updates and information as we kick off the fundraiser this month. Your boosts of our posts and your personal shares of "why I support the PSF" stories will make all the difference in our end-of-year fundraiser. If this post has you all fired up to personally support the future of Python and the PSF right now, we always welcome new PSF Supporting Members and donations.

AI

Magika 1.0 Goes Stable As Google Rebuilds Its File Detection Tool In Rust (googleblog.com) 26

BrianFagioli writes: Google has released Magika 1.0, a stable version of its AI-based file type detection tool, and rebuilt the entire engine in Rust for speed and memory safety. The system now recognizes more than 200 file types, up from about 100, and is better at distinguishing look-alike formats such as JSON vs JSONL, TSV vs CSV, C vs C++, and JavaScript vs TypeScript. The team used a 3TB training dataset and even relied on Gemini to generate synthetic samples for rare file types, allowing Magika to handle formats that don't have large, publicly available corpora. The tool supports Python and TypeScript integrations and offers a native Rust command-line client.

Under the hood, Magika uses ONNX Runtime for inference and Tokio for parallel processing, allowing it to scan around 1,000 files per second on a modern laptop core and scale further with more CPU cores. Google says this makes Magika suitable for security workflows, automated analysis pipelines, and general developer tooling. Installation is a single curl or PowerShell command, and the project remains fully open source.
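
For a sense of what the Python integration looks like, here's a minimal sketch (the file name is invented, and the exact result-field names are assumptions that have shifted between Magika releases, so treat it as illustrative rather than canonical):

    # Minimal sketch of Magika's Python API; install with "pip install magika".
    from pathlib import Path
    from magika import Magika

    m = Magika()

    # Identify a file on disk (hypothetical path)...
    result = m.identify_path(Path("example.json"))

    # ...or raw bytes, e.g. something pulled off the network.
    result_bytes = m.identify_bytes(b'{"rows": [1, 2, 3]}\n')

    # The result carries the predicted content type plus a confidence score;
    # field names (ct_label vs. label, etc.) differ across versions.
    print(result.output)
    print(result_bytes.output)
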
The project is available on GitHub and documentation can be found here.
Hardware

Manufacturer Bricks Smart Vacuum After Engineer Blocks It From Collecting Data (tomshardware.com) 35

A curious engineer discovered that his iLife A11 smart vacuum was remotely "killed" after he blocked it from sending data to the manufacturer's servers. By reverse-engineering it with custom hardware and Python scripts, he managed to revive the device to run fully offline. Tom's Hardware reports: An engineer got curious about how his iLife A11 smart vacuum worked and monitored the network traffic coming from the device. That's when he noticed it was constantly sending logs and telemetry data to the manufacturer -- something he hadn't consented to. The user, Harishankar, decided to block the telemetry servers' IP addresses on his network, while keeping the firmware and OTA servers open. While his smart gadget worked for a while, it just refused to turn on soon after. After a lengthy investigation, he discovered that a remote kill command had been issued to his device.

He sent it to the service center multiple times, wherein the technicians would turn it on and see nothing wrong with the vacuum. When they returned it to him, it would work for a few days and then fail to boot again. After several rounds of back-and-forth, the service center probably got tired and just stopped accepting it, saying it was out of warranty. Because of this, he decided to disassemble the thing to determine what killed it and to see if he could get it working again. [...] So, why did the A11 work at the service center but refuse to run in his home? The technicians would reset the firmware on the smart vacuum, thus removing the kill code, and then connect it to an open network, making it run normally. But once it connected again to the network that had its telemetry servers blocked, it was bricked remotely because it couldn't communicate with the manufacturer's servers. Since he blocked the appliance's data collection capabilities, its maker decided to just kill it altogether.

"Someone -- or something -- had remotely issued a kill command," says Harishankar. "Whether it was intentional punishment or automated enforcement of 'compliance,' the result was the same: a consumer device had turned on its owner." In the end, the owner was able to run his vacuum fully locally without manufacturer control after all the tweaks he made. This helped him retake control of his data and make use of his $300 software-bricked smart device on his own terms. As for the rest of us who don't have the technical knowledge and time to follow his accomplishments, his advice is to "Never use your primary WiFi network for IoT devices" and to "Treat them as strangers in your home."

Privacy

Manufacturer Remotely Bricks Smart Vacuum After Its Owner Blocked It From Collecting Data (tomshardware.com) 123

"An engineer got curious about how his iLife A11 smart vacuum worked and monitored the network traffic coming from the device," writes Tom's Hardware.

"That's when he noticed it was constantly sending logs and telemetry data to the manufacturer — something he hadn't consented to." The user, Harishankar, decided to block the telemetry servers' IP addresses on his network, while keeping the firmware and OTA servers open. While his smart gadget worked for a while, it just refused to turn on soon after... He sent it to the service center multiple times, wherein the technicians would turn it on and see nothing wrong with the vacuum. When they returned it to him, it would work for a few days and then fail to boot again... [H]e decided to disassemble the thing to determine what killed it and to see if he could get it working again...

[He discovered] a GD32F103 microcontroller to manage its plethora of sensors, including Lidar, gyroscopes, and encoders. He created PCB connectors and wrote Python scripts to control them with a computer, presumably to test each piece individually and identify what went wrong. From there, he built a Raspberry Pi joystick to manually drive the vacuum, proving that there was nothing wrong with the hardware. From this, he looked at its software and operating system, and that's where he discovered the dark truth: his smart vacuum was a security nightmare and a black hole for his personal data.

First of all, its Android Debug Bridge, which gives full root access to the vacuum, wasn't protected by any kind of password or encryption. The manufacturer added a makeshift security protocol by omitting a crucial file, which caused it to disconnect soon after booting, but Harishankar easily bypassed it. He then discovered that it used Google Cartographer to build a live 3D map of his home. This isn't unusual by itself. After all, it's a smart vacuum, and it needs that data to navigate around his home. However, the concerning thing is that it was sending off all this data to the manufacturer's server. It makes sense for the device to send this data to the manufacturer, as its onboard SoC is nowhere near powerful enough to process all that data. However, it seems that iLife did not clear this with its customers.

Furthermore, the engineer made one disturbing discovery — deep in the logs of his non-functioning smart vacuum, he found a command with a timestamp that matched exactly the time the gadget stopped working. This was clearly a kill command, and after he reversed it and rebooted the appliance, it roared back to life.

Thanks to long-time Slashdot reader registrations_suck for sharing the article.
Programming

TypeScript Overtakes Python and JavaScript To Claim Top Spot on GitHub (github.blog) 38

TypeScript overtook Python and JavaScript in August 2025 to become the most used language on GitHub. The shift marked the most significant language change in more than a decade. The language grew by over 1 million contributors in 2025, a 66% increase year over year, and finished August with 2,636,006 monthly contributors.

Nearly every major frontend framework now scaffolds projects in TypeScript by default. Next.js 15, Astro 3, SvelteKit 2, Qwik, SolidStart, Angular 18, and Remix all generate TypeScript codebases when developers create new projects. Type systems reduce ambiguity and catch errors from large language models before production. A 2025 academic study found 94% of LLM-generated compilation errors were type-check failures. Tooling like Vite, ts-node, Bun, and IDE autoconfig hides boilerplate setup. Among new repositories created in the past twelve months, TypeScript accounted for 5,394,256 projects. That represented a 78% increase from the prior year.
Python

Python Foundation Rejects Government Grant Over DEI Restrictions (theregister.com) 265

The Python Software Foundation rejected a $1.5 million U.S. government grant because it required them to renounce all diversity, equity, and inclusion initiatives. "The non-profit would've used the funding to help prevent supply chain attacks; create a new automated, proactive review process for new PyPI packages; and make the project's work easily transferable to other open-source package managers," reports The Register. From the report: The programming non-profit's deputy executive director Loren Crary said in a blog post today that the National Science Foundation (NSF) had offered $1.5 million to address structural vulnerabilities in Python and the Python Package Index (PyPI), but the Foundation quickly became dispirited with the terms (PDF) of the grant it would have to follow. "These terms included affirming the statement that we 'do not, and will not during the term of this financial assistance award, operate any programs that advance or promote DEI [diversity, equity, and inclusion], or discriminatory equity ideology in violation of Federal anti-discrimination laws,'" Crary noted. "This restriction would apply not only to the security work directly funded by the grant, but to any and all activity of the PSF as a whole."

To make matters worse, the terms included a provision that if the PSF was found to have violated that anti-DEI diktat, the NSF reserved the right to claw back any previously disbursed funds, Crary explained. "This would create a situation where money we'd already spent could be taken back, which would be an enormous, open-ended financial risk," the PSF director added. The PSF's mission statement enshrines a commitment to supporting and growing "a diverse and international community of Python programmers," and the Foundation ultimately decided it wasn't willing to compromise on that position, even for what would have been a solid financial boost for the organization. "The PSF is a relatively small organization, operating with an annual budget of around $5 million per year, with a staff of just 14," Crary added, noting that the $1.5 million would have been the largest grant the Foundation had ever received - but it wasn't worth it if the conditions were undermining the PSF's mission. The PSF board voted unanimously to withdraw its grant application.

AI

Slashdot Reader Mocks Databricks 'Context-Aware AI Assistant' for Odd Bar Chart 17

Long-time Slashdot reader theodp took a good look at the images on a promotional web page for Databricks' "context-aware AI assistant": If there was an AI Demo Hall of Shame, the first inductee would have to be Amazon. Their demo tried to support its CEO's claims that Amazon Q Code Transformation AI saved it 4,500 developer-years and an additional $260 million in "annualized efficiency gains" by automatically and accurately upgrading code to a more current version of Java. But it showcased a program that didn't even spell "Java" correctly. (It was instead called 'Jave')...

Today's nominee for the AI Demo Hall of Shame is analytics platform Databricks for the NYC Taxi Trips Analysis it's been showcasing on its Data Science page since last November. Not only for its choice of a completely trivial case study that requires no 'Data Science' skills — find and display the ten most expensive and longest taxi rides — but also for the horrible AI-generated bar chart used to present the results of the simple ranking, a chart that deserves its own spot in the Graph Hall of Shame. In response to a prompt of "Now create a new bar chart with matplotlib for the most expensive trips," the Databricks AI Assistant dutifully complies with the ill-advised request, spewing out Python code to display the ten rides on a nonsensical bar chart whose continuous x-axis hides points sharing the same distance. (One might also question why no annotation is provided to call out or explain the 3 trips with a distance of 0 miles that are among the ten most expensive rides, with fares of $260, $188, and $105).
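
To make the charting complaint concrete (synthetic numbers below, not the actual NYC taxi data, apart from the three zero-distance fares mentioned above): with a continuous distance axis, bars for trips that share a distance land on the same x position and hide one another, while plotting the ten rides as ordered categories keeps every bar visible.

    # Illustrative only: why a continuous x-axis is a poor fit for a
    # "ten most expensive trips" bar chart (tied distances overlap).
    import matplotlib.pyplot as plt

    fares = [260, 188, 105, 99, 95, 90, 88, 85, 84, 82]        # mostly hypothetical
    distances = [0.0, 0.0, 0.0, 35.2, 33.1, 29.0, 28.4, 35.2, 30.0, 27.5]

    fig, (ax_bad, ax_good) = plt.subplots(1, 2, figsize=(10, 4))

    # Roughly what the generated chart did: numeric x positions, so the three
    # zero-distance trips (and any other ties) pile up on a single spot.
    ax_bad.bar(distances, fares, width=0.8)
    ax_bad.set_xlabel("trip distance (miles)")
    ax_bad.set_ylabel("fare ($)")
    ax_bad.set_title("Continuous x-axis: ties overlap")

    # A clearer alternative: one bar per ride, labeled with its distance.
    positions = range(len(fares))
    ax_good.bar(positions, fares)
    ax_good.set_xticks(list(positions))
    ax_good.set_xticklabels([f"{d:g} mi" for d in distances], rotation=45)
    ax_good.set_ylabel("fare ($)")
    ax_good.set_title("One bar per ride")

    fig.tight_layout()
    plt.show()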

Looked at with a critical eye, these examples used to sell data scientists, educators, management, investors, and Wall Street on AI would likely raise eyebrows rather than impress their intended audiences.
AI

AI Slop? Not This Time. AI Tools Found 50 Real Bugs In cURL (theregister.com) 92

The Register reports: Over the past two years, the open source curl project has been flooded with bogus bug reports generated by AI models. The deluge prompted project maintainer Daniel Stenberg to publish several blog posts about the issue in an effort to convince bug bounty hunters to show some restraint and not waste contributors' time with invalid issues. Shoddy AI-generated bug reports have been a problem not just for curl, but also for the Python community, Open Collective, and the Mesa Project.

It turns out the problem is people rather than technology. Last month, the curl project received dozens of potential issues from Joshua Rogers, a security researcher based in Poland. Rogers identified assorted bugs and vulnerabilities with the help of various AI scanning tools. And his reports were not only valid but appreciated. Stenberg in a Mastodon post last month remarked, "Actually truly awesome findings." In his mailing list update last week, Stenberg said, "most of them were tiny mistakes and nits in ordinary static code analyzer style, but they were still mistakes that we are better off having addressed. Several of the found issues were quite impressive findings...."

Stenberg told The Register that about 50 bugfixes based on Rogers' reports have been merged. "In my view, this list of issues achieved with the help of AI tooling shows that AI can be used for good," he said in an email. "Powerful tools in the hand of a clever human is certainly a good combination. It always was...!" Rogers wrote up a summary of the AI vulnerability scanning tools he tested. He concluded that these tools — Almanax, Corgea, ZeroPath, Gecko, and Amplify — are capable of finding real vulnerabilities in complex code.

The Register's conclusion? AI tools "when applied with human intelligence by someone with meaningful domain experience, can be quite helpful."

jantangring (Slashdot reader #79,804) has published an article on Stenberg's new position, including recently published comments from Stenberg that "It really looks like these new tools are finding problems that none of the old, established tools detect."
Programming

Will AI Bring an End to Top Programming Language Rankings? (ieee.org) 51

IEEE Spectrum ranks the popularity of programming languages — but is there a problem? Programmers "are turning away from many of these public expressions of interest. Rather than page through a book or search a website like Stack Exchange for answers to their questions, they'll chat with an LLM like Claude or ChatGPT in a private conversation." And with an AI assistant like Cursor helping to write code, the need to pose questions in the first place is significantly decreased. For example, across the total set of languages evaluated in the Top Programming Languages, the number of questions we saw posted per week on Stack Exchange in 2025 was just 22% of what it was in 2024...

However, an even more fundamental problem is looming in the wings... In the same way most developers today don't pay much attention to the instruction sets and other hardware idiosyncrasies of the CPUs that their code runs on, which language a program is vibe coded in ultimately becomes a minor detail... [T]he popularity of different computer languages could become as obscure a topic as the relative popularity of railway track gauges... But if an AI is soothing our irritations with today's languages, will any new ones ever reach the kind of critical mass needed to make an impact? Will the popularity of today's languages remain frozen in time?

That's ultimately the larger question: "[H]ow much abstraction and anti-foot-shooting structure will a sufficiently-advanced coding AI really need...?" [C]ould we get our AIs to go straight from prompt to an intermediate language that could be fed into the interpreter or compiler of our choice? Do we need high-level languages at all in that future? True, this would turn programs into inscrutable black boxes, but they could still be divided into modular testable units for sanity and quality checks. And instead of trying to read or maintain source code, programmers would just tweak their prompts and generate software afresh.

What's the role of the programmer in a future without source code? Architecture design and algorithm selection would remain vital skills... How should a piece of software be interfaced with a larger system? How should new hardware be exploited? In this scenario, computer science degrees, with their emphasis on fundamentals over the details of programming languages, rise in value over coding boot camps.

Will there be a Top Programming Language in 2026? Right now, programming is going through the biggest transformation since compilers broke onto the scene in the early 1950s. Even if the predictions that much of AI is a bubble about to burst come true, the thing about tech bubbles is that there's always some residual technology that survives. It's likely that using LLMs to write and assist with code is something that's going to stick. So we're going to be spending the next 12 months figuring out what popularity means in this new age, and what metrics might be useful to measure.

Having said that, IEEE Spectrum still ranks programming language popularity three ways — based on use among working programmers, demand from employers, and "trending" in the zeitgeist — using seven different metrics.

Their results? Among programmers, "we see that once again Python has the top spot, with the biggest change in the top five being JavaScript's drop from third place last year to sixth place this year. As JavaScript is often used to create web pages, and vibe coding is often used to create websites, this drop in the apparent popularity may be due to the effects of AI... In the 'Jobs' ranking, which looks exclusively at what skills employers are looking for, we see that Python has also taken 1st place, up from second place last year, though SQL expertise remains an incredibly valuable skill to have on your resume."
Government

Should Salesforce's Tableau Be Granted a Patent On 'Visualizing Hierarchical Data'? 72

Long-time Slashdot reader theodp says America's Patent and Trademark Office (USPTO) has granted Tableau (Salesforce's visual analytics platform) a patent covering "Data Processing For Visualizing Hierarchical Data": "A provided data model may include a tree specification that declares parent-child relationships between objects in the data model. In response to a query associated with objects in the data model: employing the parent-child relationships to determine a tree that includes parent objects and child objects from the objects based on the parent-child relationships; determining a root object based on the query and the tree; traversing the tree from the root object to visit the child objects in the tree; determining partial results based on characteristics of the visited child objects such that the partial results are stored in an intermediate table; and providing a response to the query that includes values based on the intermediate table and the partial results."

A set of 15 simple drawings is provided to support the legal and tech gobbledygook of the invention claims. A person can have a manager, Tableau explains in Figures 5-6 of its accompanying drawings, and that manager can also manage and be managed by other people. Not only that, Tableau illustrates in Figures 7-10 that computers can be used to count how many people report to a manager. How does this magic work, you ask? Well, you "generate [a] tree" [Fig. 13] and "traverse a tree" [Fig. 15], Tableau explains. But wait, there's more — you can also display the people who report to a manager in multi-level or nested pie charts (aka Sunburst charts), Tableau demonstrates in Fig. 11.
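
To the submitter's point about this being first-year material, counting how many people roll up to a manager really is a few lines of tree traversal over parent-child relationships (a toy sketch with a made-up org chart, not Tableau's claimed implementation):

    # Toy example: given employee -> manager ("parent-child") relationships,
    # count how many people report, directly or indirectly, to each manager.
    from collections import defaultdict

    reports_to = {          # hypothetical org chart
        "bob": "alice",
        "carol": "alice",
        "dave": "carol",
        "erin": "carol",
    }

    children = defaultdict(list)
    for person, manager in reports_to.items():
        children[manager].append(person)

    def headcount(manager):
        """Depth-first traversal: direct reports plus everyone under them."""
        return sum(1 + headcount(r) for r in children[manager])

    print(headcount("alice"))  # 4
    print(headcount("carol"))  # 2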

Interestingly, Tableau released a "pre-Beta" Sunburst chart type in late April 2023 but yanked it at the end of June 2023 (others have long-supported Sunburst charts, including Plotly). So, do you think Tableau should be awarded a patent in 2025 on a concept that has roots in circa-1921 Sunburst charts and tree algorithms taught to first-year CS students in circa-1975 Data Structures courses?
Power

Wind and Solar Will Power Datacenters More Cheaply Than Nuclear, Study Finds (theregister.com) 131

An anonymous reader quotes a report from The Register: Renewable energy sources could power datacenters at a lower cost than relying on nuclear generation from small modular reactors (SMRs), claims a recently revealed study. ... [A]nalysis from the Centre for Net Zero (CNZ) says it would cost 43 percent less to power a 120 MW data facility with renewables and a small amount of gas-generated energy, when compared with an SMR. It claims that a microgrid comprising offshore wind, solar, battery storage, and backed up by gas generation, would be significantly cheaper to run annually than procuring power sourced from a nuclear SMR.

[...] CNZ describes itself as an open research institute, founded by Octopus Energy Group in the UK, and claims to advise the State of California and Europe's International Energy Agency as well as the British government. While CNZ's study applies to the UK sector, where energy costs are among the highest in the industrialized world, it is likely that the overall conclusion would still be valid in other countries as well. Its analysis shows that renewables can meet 80 percent of the constant demand from a large datacenter over the course of a year. Offshore wind can provide the majority of load requirements, with gas generation backed by battery storage as a stopgap source of power representing the most cost-optimal mix.

Greater capacity in the on-site battery storage system would reduce the reliance on gas power, and this would likely happen over time as the cost of such systems is expected to come down, the report claims. But perhaps the real kicker is that CNZ estimates that microgrids powered largely by renewables could be built in approximately five years, while operational SMRs are not expected to be widely available until sometime in the next decade. CNZ says that it calculated the typical yearly resource cost (capex and opex) of powering a datacenter with a nuclear SMR, and modeled this using Python for Power System Analysis (PyPSA), an open source energy modeling tool, against two renewable energy scenarios. One was the wind, solar, battery, and gas mix, while the other omitted solar.
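
The CNZ model itself isn't reproduced in the report excerpt, but for a feel of what a PyPSA formulation looks like, here's a heavily simplified single-bus sketch (component sizes, costs, and profiles are all invented; recent PyPSA releases expose optimize() while older ones use lopf(), and an LP solver such as HiGHS must be installed):

    # Illustrative single-bus PyPSA model: a constant datacenter load served
    # by wind, solar, a battery, and backup gas. Numbers are not from the study.
    import pandas as pd
    import pypsa

    n = pypsa.Network()
    n.set_snapshots(range(24))  # one illustrative day at hourly resolution

    n.add("Bus", "site")
    n.add("Load", "datacenter", bus="site", p_set=120)  # flat 120 MW demand

    # Variable renewables with made-up hourly availability profiles.
    wind_profile = pd.Series([0.6 + 0.3 * ((h % 12) / 12) for h in range(24)],
                             index=n.snapshots)
    solar_profile = pd.Series([max(0.0, 1 - abs(h - 12) / 6) for h in range(24)],
                              index=n.snapshots)
    n.add("Generator", "wind", bus="site", p_nom=200, marginal_cost=0,
          p_max_pu=wind_profile)
    n.add("Generator", "solar", bus="site", p_nom=100, marginal_cost=0,
          p_max_pu=solar_profile)

    # Dispatchable gas backup plus a battery to fill the gaps.
    n.add("Generator", "gas", bus="site", p_nom=120, marginal_cost=80)
    n.add("StorageUnit", "battery", bus="site", p_nom=60, max_hours=4)

    n.optimize()                   # n.lopf() on older PyPSA releases
    print(n.generators_t.p.sum())  # energy served by each generator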

Operating Systems

Fedora Linux 43 Beta Released (nerds.xyz) 9

BrianFagioli shares a report from NERDS.xyz: The Fedora Project has announced Fedora Linux 43 Beta, giving users and developers the opportunity to test the distribution ahead of its final release. This beta introduces improvements across installation, system tools, and programming languages while continuing Fedora's pattern of cleaning out older components. The beta can be downloaded in Workstation, KDE Plasma, Server, IoT, and Cloud editions. Spins and Labs are also available, though MATE and i3 are not provided in some builds. Existing systems can be upgraded with DNF system-upgrade. Fedora CoreOS will follow one week later through its "next" stream. The beta brings enhancements to its Anaconda WebUI, moves to Python 3.14, and supports Wayland-only GNOME, among many other changes. A full list of improvements and system enhancements can be found here.

The official release should be available in late October or early November.
Perl

Is Perl the World's 10th Most Popular Programming Language? (i-programmer.info) 86

TIOBE attempts to calculate programming language popularity using the number of skilled engineers, courses, and third-party vendors.

And the eight most popular languages in September's rankings haven't changed since last month:

1. Python
2. C++
3. C
4. Java
5. C#
6. JavaScript
7. Visual Basic
8. Go

But by TIOBE's ranking, Perl is still the #10 most-popular programming language in September (dropping from #9 in August). "One year ago Perl was at position 27 and now it suddenly pops up at position 10 again," marvels TIOBE CEO Paul Jansen. The technical reason why Perl is rated this high is its huge number of books listed on Amazon. It has 4 times more books listed than, for instance, PHP, or 7 times more books than Rust. The underlying "real" reason for Perl's increase in popularity is unknown to me. The only possibility I can think of is that Perl 5 is now gradually considered to become the real Perl... Perl 6/Raku is at position 129 of the TIOBE index, thus playing no role at all in the programming world. Perl 5, on the other hand, is releasing more often recently, thus gaining attention.
An article at the i-Programmer blog thinks Perl's resurgence could be from its text processing capabilities: Even in this era of AI, everything is still governed by text formats; text is still the King. XML, JSON calling APIs, YAML, Markdown, Log files... That means that there's still a need to process it, transform it, clean it, extract from it. Perl, with its first-class-citizen regular expressions, the wealth of text manipulation libraries up on CPAN, and its full Unicode support of all the latest standards, was and is still the best. Simply there's no other that can match Perl's text processing capabilities.
They also cite Perl's backing by the open source community, and its "getting a 'proper' OOP model in the last couple of years... People just don't know what Perl is capable of and instead prefer to be victims of FOMO ephemeral trends, chasing behind the new and shiny."

Perl creator Larry Wall answered questions from Slashdot's readers in 2016. So I'd be curious to hear from Slashdot's readers about Perl today. (Share your experiences in the comments if you're still using Perl -- or Raku...)

Perl's drop to #10 means Delphi/Object Pascal rises up one rank, growing from 1.82% in August to 2.26% in September to claim September's #9 spot. "At number 11 and 1.86%, SQL is quite close to entering the top 10 again," notes TechRepublic. (SQL fell to #12 in June, which the site speculated was due to "the increased use of NoSQL databases for AI applications.")

But TechRepublic adds that the #1 most popular programming language (according to TIOBE) is still Python: Perl sits at 2.03% in TIOBE's proprietary ranking system in September, up from 0.64% in January. Last year, Perl held the 27th position... Python's unstoppable rise dipped slightly from 26.14% in August to 25.98% in September. Python is still well ahead of every other language on the index.
Python

New Python Documentary Released On YouTube (youtube.com) 46

"From a side project in Amsterdam to powering AI at the world's biggest companies — this is the story of Python," says the description of a new 84-minute documentary.

Long-time Slashdot reader destinyland writes: It traces Python all the way back to its origins in Amsterdam back in 1991. (Although the first time Guido van Rossum showed his new language to a co-worker, they'd typed one line of code just to prove they could crash Python's first interpreter.) The language slowly spread after van Rossum released it on Usenet — split across 21 separate posts — and Robin Friedrich, a NASA aerospace engineer, remembers using Python to build flight simulations for the Space Shuttle. (Friedrich says in the documentary he also attended Guido's first in-person U.S. workshop in 1994, and "I still have the t-shirt...")

Dropbox's CEO/founder Drew Houston describes what it was like being one of the first companies to use Python to build a product reaching millions of users. (Another success story was YouTube, which was built by a small team using Python before being acquired by Google). Anaconda co-founder Travis Oliphant remembers Python's popularity increasing even more thanks to the data science/machine learning community. But the documentary also includes the controversial move to Python 3 (which broke compatibility with earlier versions). Though ironically, one of the people slogging through a massive code migration ended up being van Rossum himself at his new job at Dropbox. The documentary also includes van Rossum's resignation as "Benevolent Dictator for Life" after approving the walrus operator. (In van Rossum's words, he essentially "rage-quit over this issue.")

But the focus is on Python's community. At one point, various interviewees even take turns reciting passages from the "Zen of Python" — which to this day is still hidden in Python as an import-able library as a kind of Easter Egg.
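
(If you've never tripped over that Easter egg yourself, it's one line at any Python prompt:)

    >>> import this
    The Zen of Python, by Tim Peters

    Beautiful is better than ugly.
    Explicit is better than implicit.
    ...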

"It was a massive undertaking", the documentary's director explains in a new interview, describing a full year of interviews. (The article features screenshots from the documentary — including a young Guido van Rossum and the original 1991 email that announced Python to the world.) [Director Bechtle] is part of a group that's filmed documentaries on everything from Kubernetes and Prometheus to Angular, Node.js, and Ruby on Rails... Originally part of the job platform Honeypot, the documentary-makers relaunched in April as Cult.Repo, promising they were "100% independent and more committed than ever to telling the human stories behind technology."
Honeypot's founder Emma Tracey bought back its 272,000-subscriber YouTube channel from Honeypot's new owners, New Work SE, and Cult.Repo now bills itself as "The home of Open Source documentaries."

Over in a thread at Python.org, language creator Guido van Rossum has identified the Python community members in the film's Monty Python-esque poster art. And core developer Hugo van Kemenade notes there's also a video from EuroPython with a 55-minute Q&A about the documentary.
Robotics

Florida Deploys Robot Rabbits To Control Invasive Burmese Python Population (cbsnews.com) 75

An anonymous reader quotes a report from CBS News: They look, move and even smell like the kind of furry Everglades marsh rabbit a Burmese python would love to eat. But these bunnies are robots meant to lure the giant invasive snakes out of their hiding spots. It's the latest effort by the South Florida Water Management District to eliminate as many pythons as possible from the Everglades, where they are decimating native species with their voracious appetites. In Everglades National Park, officials say the snakes have eliminated 95% of small mammals as well as thousands of birds. "Removing them is fairly simple. It's detection. We're having a really hard time finding them," said Mike Kirkland, lead invasive animal biologist for the water district. "They're so well camouflaged in the field."

The water district and University of Florida researchers deployed 120 robot rabbits this summer as an experiment. Previously, there was an effort to use live rabbits as snake lures but that became too expensive and time-consuming, Kirkland said. The robots are simple toy rabbits, but retrofitted to emit heat, a smell and to make natural movements to appear like any other regular rabbit. "They look like a real rabbit," Kirkland said. They are solar powered and can be switched on and off remotely. They are placed in small pens monitored by a video camera that sends out a signal when a python is nearby. "Then I can deploy one of our many contractors to go out and remove the python," Kirkland said. The total cost per robot rabbit is about $4,000, financed by the water district, he added.

Python

Survey Finds More Python Developers Like PostgreSQL, AI Coding Agents - and Rust for Packages (jetbrains.com) 85

More than 30,000 Python developers from around the world answered questions for the Python Software Foundation's annual survey — and PSF Fellow Michael Kennedy tells the Python community what they've learned in a new blog post. Some highlights: Most still use older Python versions despite benefits of newer releases... Many of us (15%) are running on the very latest released version of Python, but more likely than not, we're using a version a year old or older (83%). [Although less than 1% are using "Python 3.5 or lower".] The survey also indicates that many of us are using Docker and containers to execute our code, which makes this 83% or higher number even more surprising... You simply choose a newer runtime, and your code runs faster. CPython has been extremely good at backward compatibility. There's rarely significant effort involved in upgrading... [He calculates some cloud users are paying an extra $420,000 to $5.6M in compute costs.] If your company realizes you are burning an extra $0.4M-$5M a year because you haven't gotten around to spending the day it takes to upgrade, that'll be a tough conversation...
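
As a back-of-the-envelope illustration of that math (all numbers below are hypothetical, not Kennedy's actual figures): a faster runtime means the same work needs proportionally less compute, so even a modest speedup on a large CPU-bound bill adds up.

    # Hypothetical back-of-envelope: compute spend an interpreter speedup could
    # avoid. Illustrative numbers only, not taken from the survey post.
    annual_compute_bill = 5_000_000   # dollars/year on CPU-bound Python work
    speedup = 0.10                    # fractional speedup from a newer CPython

    avoidable_spend = annual_compute_bill * (1 - 1 / (1 + speedup))
    print(f"${avoidable_spend:,.0f} per year")  # ~$454,545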

Rust is how we speed up Python now... The Python Language Summit of 2025 revealed that "Somewhere between one-quarter and one-third of all native code being uploaded to PyPI for new projects uses Rust", indicating that "people are choosing to start new projects using Rust". Looking into the survey results, we see that Rust usage grew from 27% to 33% for binary extensions to Python packages... [The blog post later advises Python developers to learn to read basic Rust, "not to replace Python, but to complement it," since Rust "is becoming increasingly important in the most significant portions of the Python ecosystem."]

PostgreSQL is the king of Python databases, and it's only growing, going from 43% to 49%. That's +14% year over year, which is remarkable for a 28-year-old open-source project... [E]very single database in the top six grew in usage year over year. This is likely another indicator that web development itself is growing again, as discussed above...

[N]early half of the respondents (49%) plan to try AI coding agents in the coming year. Program managers at major tech companies have stated that they almost cannot hire developers who don't embrace agentic AI. The productivity delta between those using it and those who avoid it is simply too great (estimated at about 30% greater productivity with AI).

It's their eighth annual survey (conducted in collaboration with JetBrains last October and November). But even though Python is 34 years old, it's still evolving. "In just the past few months, we have seen two new high-performance typing tools released," notes the blog post. (The ty and Pyrefly typecheckers — both written in Rust.) And Python 3.14 will be the first version of Python to completely support free-threaded Python... Just last week, the steering council and core developers officially accepted this as a permanent part of the language and runtime... Developers and data scientists will have to think more carefully about threaded code with locks, race conditions, and the performance benefits that come with it. Package maintainers, especially those with native code extensions, may have to rewrite some of their code to support free-threaded Python so they themselves do not enter race conditions and deadlocks.

There is a massive upside to this as well. I'm currently writing this on the cheapest Apple Mac Mini M4. This computer comes with 10 CPU cores. That means until this change manifests in Python, the maximum performance I can get out of a single Python process is 10% of what my machine is actually capable of. Once free-threaded Python is fully part of the ecosystem, I should get much closer to maximum capacity with a standard Python program using threading and the async and await keywords.
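
A sketch of the kind of code that stands to benefit (the workload and thread count are arbitrary): on a GIL build the threads effectively serialize for CPU-bound work, while on a free-threaded 3.14 build the same program can spread across cores.

    # CPU-bound work fanned out across threads. Under the traditional GIL this
    # runs roughly single-core; on a free-threaded (PEP 703) build it can scale.
    from concurrent.futures import ThreadPoolExecutor

    def busy(n: int) -> int:
        total = 0
        for i in range(n):
            total += i * i
        return total

    if __name__ == "__main__":
        chunks = [5_000_000] * 10  # ten chunks of pure-Python number crunching
        with ThreadPoolExecutor(max_workers=10) as pool:
            results = list(pool.map(busy, chunks))
        print(sum(results))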

Some other notable findings from the survey:
  • Data science is now over half of all Python. This year, 51% of all surveyed Python developers are involved in data exploration and processing, with pandas and NumPy being the tools most commonly used for this.
  • Exactly 50% of respondents have less than two years of professional coding experience! And 39% have less than two years of experience with Python (even in hobbyist or educational settings)...
  • "The survey tells us that one-third of devs contributed to open source. This manifests primarily as code and documentation/tutorial additions."

Python

Python Surges in Popularity. And So Does Perl (techrepublic.com) 80

Last month, Python "reached the highest ranking a programming language ever had in the TIOBE index," according to TIOBE CEO Paul Jansen.

"We thought Python couldn't grow any further, but AI code assistants let Python take yet another step forward." According to recent studies of Stanford University (Yegor Denisov-Blanch), AI code assistants such as Microsoft Copilot, Cursor or Google Gemini Code Assist are 20% more effective if used for popular programming languages. The reason for this is obvious: there is more code for these languages available to train the underlying models. This trend is visible in the TIOBE index as well, where we see a consolidation of languages at the top. Why would you start to learn a new obscure language for which no AI assistance is available? This is the modern way of saying that you don't want to learn a new language that is hardly documented and/or has too few libraries that can help you.
TIOBE's "Programming Community Index" attempts to calculate the popularity of languages using the number of skilled engineers, courses, and third-party vendors. It nows gives Python a 26.14% rating, which TechRepublic notes "is well ahead of the next two programming languages on this month's leaderboard: C++ is at 9.18% and C is 9.03%." But the first top six languages haven't changed since last year...
  1. Python
  2. C++
  3. C
  4. Java
  5. C#
  6. JavaScript

Since August of 2024 SQL has dropped from its #7 rank down to #12 (meaning Visual Basic and Go each rise up one rank from their position a year ago, into the #7 and #8 positions).

In the last year Perl has risen from the #25 position to #9, beating out Delphi/Object Pascal at #10, and Fortran at #11 (last year's #10). TIOBE CEO Jansen told TechRepublic in an email that many people were asking why Perl was becoming more popular, but he didn't have a definitive answer. He said he double-checked the underlying data and found the increase to be accurate, though the reason for the shift remains unclear.


Python

How Python is Fighting Open Source's 'Phantom' Dependencies Problem (blogspot.com) 33

Since 2023 the Python Software Foundation has had a Security Developer-in-Residence (sponsored by the Open Source Security Foundation's vulnerability-finding "Alpha-Omega" project). And he's just published a new 11-page white paper about open source's "phantom dependencies" problem — suggesting a way to solve it.

"Phantom" dependencies aren't tracked with packaging metadata, manifests, or lock files, which makes them "not discoverable" by tools like vulnerability scanners or compliance and policy tools. So Python security developer-in-residence Seth Larson authored a recently-accepted Python Enhancement Proposal offering an easy way for packages to provide metadata through Software Bill-of-Materials (SBOMs). From the whitepaper: Python Enhancement Proposal 770 is backwards compatible and can be enabled by default by tools, meaning most projects won't need to manually opt in to begin generating valid PEP 770 SBOM metadata. Python is not the only software package ecosystem affected by the "Phantom Dependency" problem. The approach using SBOMs for metadata can be remixed and adopted by other packaging ecosystems looking to record ecosystem-agnostic software metadata...

Within Endor Labs' [2023 dependencies] report, Python is named as one of the most affected packaging ecosystems by the "Phantom Dependency" problem. There are multiple reasons that Python is particularly affected:

- There are many methods for interfacing Python with non-Python software, such as through the C-API or FFI. Python can "wrap" and expose an easy-to-use Python API for software written in other languages like C, C++, Rust, Fortran, Web Assembly, and more.

- Python is the premier language for scientific computing and artificial intelligence, meaning many high-performance libraries written in system languages need to be accessed from Python code.

- Finally, Python packages have a distribution type called a "wheel", which is essentially a zip file that is "installed" by being unzipped into a directory, meaning there is no compilation step allowed during installation. This is great for being able to inspect a package before installation, but it means that all compiled languages need to be pre-compiled into binaries before installation...


When designing a new package metadata standard, one of the top concerns is reducing the amount of effort required from the mostly volunteer maintainers of packaging tools and the thousands of projects being published to the Python Package Index... By defining PEP 770 SBOM metadata as using a directory of files, rather than a new metadata field, we were able to side-step all the implementation pain...

We'll be working to submit issues on popular open source SBOM and vulnerability scanning tools, and gradually, Phantom Dependencies will become less of an issue for the Python package ecosystem.

The white paper "details the approach, challenges, and insights into the creation and acceptance of PEP 770 and adopting Software Bill-of-Materials (SBOMs) to improve the measurability of Python packages," explains an announcement from the Python Software Foundation. And the white paper ends with a helpful note.

"Having spoken to other open source packaging ecosystem maintainers, we have come to learn that other ecosystems have similar issues with Phantom Dependencies. We welcome other packaging ecosystems to adopt Python's approach with PEP 770 and are willing to provide guidance on the implementation."
