Channel: Oracle Corporation | TechCrunch

YugaByte’s new database software rakes in $16 million so developers can move to any cloud

Looking to expand the footprint of its toolkit giving developers a unified database software that can work for both relational and post-relational databases, YugaByte has raised $16 million in a new round of funding.

For co-founder and chief executive Kannan Muthukkaruppan, the new database software reduces programming complexity and frees developers from the risk of lock-in with any single cloud compute provider, as Amazon, Microsoft and Google jockey for pole position among software developers.

“YugaByte DB makes it possible for organizations to standardize on a single, distributed database to support a multitude of workloads requiring both SQL and NoSQL capabilities. This speeds up the development of applications while at the same time reduces operational complexity and licensing costs,” said Muthukkaruppan in a statement. 

Muthukkaruppan and his fellow co-founders know their way around database software. Alongside Karthik Ranganathan and Mikhail Bautin, Muthukkaruppan built the NoSQL platform that powered Facebook Messenger and Facebook's internal time series monitoring system. Before that, Ranganathan and Muthukkaruppan had spent time working at Oracle. And after Facebook, the two were integral to the development of Nutanix’s hybrid infrastructure.

“These are tens of petabytes of data handling tens of millions of messages a day,” says Muthukkaruppan.

Ranganathan and Muthukkaruppan left Nutanix in 2016 to begin working on YugaByte’s database software. What’s important, founders and investors stress, is that YugaByte breaks any chains that would bind software developers to a single platform or provider.

While developers can move applications from one cloud provider to another, they have to maintain multiple databases across these systems to keep them interoperable.

“YugaByte’s value proposition is strong for both CIOs, who can avoid cloud vendor lock-in at the database layer, and for developers, who don’t have to re-architect existing applications because of YugaByte’s built-in native compatibility to popular NoSQL and SQL interfaces,” said Deepak Jeevankumar, a managing director at Dell Technologies Capital.

Jeevankumar’s firm co-led the latest $16 million financing for YugaByte alongside previous investor Lightspeed Venture Partners.

What attracted Lightspeed and Dell’s new investment arm was the support the company has from engineers in the trenches, like Ian Andrews, the vice president of products at Pivotal. “YugaByte is going to be interesting to any enterprise requiring an elastic data tier for their cloud-native applications,” Andrews said in a statement. “Even more so if they have a requirement to operate across multiple clouds or in a Kubernetes environment.” 

With new software infrastructure, portability is critical, as data needs to move between and among different software architectures.

The problem is that traditional databases have a hard time scaling, while newer database technologies aren’t terribly reliable when it comes to data consistency and durability. So developers have been using legacy database software like Oracle Database and PostgreSQL for their systems of record, and then newer database software like Microsoft Azure’s Cosmos DB, Amazon’s DynamoDB, Apache Cassandra (which the fellas used at Facebook) or MongoDB for distributed application workloads (things like linear write/read scalability, plus auto-rebalancing, sharding and failover).

With YugaByte, software developers get support for Apache Cassandra and Redis APIs, along with support for PostgreSQL, which the company touts as the best of both the relational and post-relational database worlds.
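For a sense of what that multi-API surface looks like in practice, here is a minimal, hypothetical Python sketch. The port numbers reflect YugaByte's publicly documented defaults (5433 for the PostgreSQL-compatible YSQL API, 9042 for the Cassandra-compatible YCQL API, 6379 for the Redis-compatible YEDIS API), but they are assumptions about any given deployment, and the helper function is illustrative rather than part of any YugaByte client library:

```python
# Hypothetical sketch: one YugaByte DB cluster exposes three wire-compatible
# API endpoints, so existing client drivers can connect without code changes.
# Port numbers are assumptions based on YugaByte's published defaults.

DEFAULT_API_PORTS = {
    "ysql": 5433,   # PostgreSQL-compatible API (a psycopg2 client would connect here)
    "ycql": 9042,   # Cassandra-compatible API (e.g. the DataStax cassandra-driver)
    "yedis": 6379,  # Redis-compatible API (e.g. redis-py)
}

def connection_params(host, api):
    """Build connection parameters for one of the cluster's API surfaces."""
    if api not in DEFAULT_API_PORTS:
        raise ValueError(f"unknown API: {api!r}")
    return {"host": host, "port": DEFAULT_API_PORTS[api]}

print(connection_params("db.example.internal", "ysql"))
# {'host': 'db.example.internal', 'port': 5433}
```

The point of the sketch is that an application's relational tables and its NoSQL workloads can target the same cluster, differing only in which endpoint a driver dials.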

Now that the company has $16 million more in the bank, it can begin spreading the word about the benefits of its new database software, says Muthukkaruppan.

“With the additional funding we will accelerate investments in engineering, sales and customer success to scale our support for enterprises looking to bring their business-critical data to the cloud,” he said in a statement. 


After 20 years of Salesforce, what Marc Benioff got right and wrong about the cloud

As we enter the twentieth year of Salesforce, there’s an interesting opportunity to reflect back on the change that Marc Benioff created with the software-as-a-service (SaaS) model for enterprise software with his launch of Salesforce.com.

This model has been validated by the annual revenue stream of SaaS companies, which is fast approaching $100 billion by most estimates, and it will likely continue to transform many slower-moving industries for years to come.

However, for the cornerstone market in IT — large enterprise-software deals — SaaS represents less than 25 percent of total revenue, according to most market estimates. This split is even evident in the most recent high-profile “SaaS” acquisition of GitHub by Microsoft, with more than 50 percent of GitHub’s revenue coming from the sale of their on-prem offering, GitHub Enterprise.  

Data privacy and security are also becoming major issues, with Benioff himself even pushing for a U.S. privacy law on par with GDPR in the European Union. While consumer data is often the focus of such discussions, it’s worth remembering that SaaS providers store and process an incredible amount of personal data on behalf of their customers, and the content of that data goes well beyond email addresses for sales leads.

It’s time to reconsider the SaaS model in a modern context, integrating developments of the last nearly two decades so that enterprise software can reach its full potential. More specifically, we need to consider the impact of IaaS and “cloud-native computing” on enterprise software, and how they’re blurring the lines between SaaS and on-premises applications. As the world around enterprise software shifts and the tools for building it advance, do we really need such stark distinctions about what can run where?

The original cloud software thesis

In his book, Behind the Cloud, Benioff lays out four primary reasons for the introduction of the cloud-based SaaS model:

  1. Realigning vendor success with customer success by creating a subscription-based pricing model that grows with each customer’s usage (providing the opportunity to “land and expand”). Previously, software licenses often cost millions of dollars and were paid upfront, with customers then obligated to pay an additional 20 percent in support fees each year. This traditional pricing structure created significant financial barriers to adoption and made procurement painful and elongated.
  2. Putting software in the browser to kill the client-server enterprise software delivery experience. Benioff recognized that consumers were increasingly comfortable using websites to accomplish complex tasks. By utilizing the browser, Salesforce avoided the complex local client installation and allowed its software to be accessed anywhere, anytime and on any device.
  3. Sharing the cost of expensive compute resources across multiple customers by leveraging a multi-tenant architecture. This ensured that no individual customer needed to invest in expensive computing hardware required to run a given monolithic application. For context, in 1999, a gigabyte of RAM cost about $1,000 and a TB of disk storage was $30,000. Benioff cited a typical enterprise hardware purchase of $385,000 in order to run Siebel’s CRM product that might serve 200 end-users.
  4. Democratizing the availability of software by removing the installation, maintenance and upgrade challenges. Drawing from his background at Oracle, he cited experiences where it took 6-18 months to complete the installation process. Additionally, upgrades were notorious for their complexity and caused significant downtime for customers. Managing enterprise applications was a very manual process, generally with each IT org becoming the ops team executing a physical run-book for each application they purchased.
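The cash-flow contrast behind point 1 is easy to make concrete. The sketch below compares the two models with hypothetical numbers (a $1M perpetual license with the 20 percent annual support fee mentioned above, versus a made-up $100/user/month subscription for the 200-user deployment size cited in point 3):

```python
# Back-of-the-envelope comparison of the two pricing models described above.
# All dollar figures are hypothetical, chosen only to show the cash-flow shape.

def perpetual_cost(license_fee, support_rate, years):
    """Upfront license plus annual support fees (e.g. 20% of the license)."""
    return license_fee + license_fee * support_rate * years

def subscription_cost(per_user_month, users, years):
    """Pay-as-you-go subscription that scales with actual usage."""
    return per_user_month * users * 12 * years

# $1M license + 20%/yr support vs. $100/user/month for 200 users, over 3 years:
print(perpetual_cost(1_000_000, 0.20, 3))   # 1600000.0
print(subscription_cost(100, 200, 3))       # 720000
```

Beyond the smaller total, the subscription spend starts near zero and grows with usage, which is exactly the "land and expand" dynamic Benioff was after.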

These arguments also happen to be, more or less, the same ones made by infrastructure-as-a-service (IaaS) providers such as Amazon Web Services during their early days in the mid-late ’00s. However, IaaS adds value at a layer deeper than SaaS, providing the raw building blocks rather than the end product. The result of their success in renting cloud computing, storage and network capacity has been many more SaaS applications than ever would have been possible if everybody had to follow the model Salesforce did several years earlier.

Suddenly able to access computing resources by the hour — and free from large upfront capital investments or having to manage complex customer installations — startups forsook software for SaaS in the name of economics, simplicity and much faster user growth.

It’s a different IT world in 2018

Fast-forward to today, and in some ways it’s clear just how prescient Benioff was in pushing the world toward SaaS. Of the four reasons laid out above, Benioff nailed the first two:

  • Subscription is the right pricing model: The subscription pricing model for software has proven to be the most effective way to create customer and vendor success. Stalwart products like Microsoft Office and the Adobe Suite long ago made the switch from the upfront model to thriving subscription businesses. Today, subscription pricing is the norm for many flavors of software and services.
  • Better user experience matters: Software accessed through the browser or thin, native mobile apps (leveraging the same APIs and delivered seamlessly through app stores) has long since become ubiquitous. The consumerization of IT was a real trend, and it has driven the habits from our personal lives into our business lives.

In other areas, however, things today look very different than they did back in 1999. In particular, Benioff’s other two primary reasons for embracing SaaS no longer seem so compelling. Ironically, IaaS economies of scale (especially once Google and Microsoft began competing with AWS in earnest) and software-development practices developed inside those “web scale” companies played major roles in spurring these changes:

  • Computing is now cheap: The cost of compute and storage has been driven down so dramatically that there are limited cost savings in shared resources. Today, a gigabyte of RAM is about $5 and a terabyte of disk storage is about $30 if you buy them directly. Cloud providers give away resources to small users and charge only pennies per hour for standard-sized instances. By comparison, at the same time that Salesforce was founded, Google was running on its first data center — with combined total compute and RAM comparable to that of a single iPhone X. That is not a joke.
  • Installing software is now much easier: The process of installing and upgrading modern software has become automated with the emergence of continuous integration and deployment (CI/CD) and configuration-management tools. With the rapid adoption of containers and microservices, cloud-native infrastructure has become the de facto standard for local development and is becoming the standard for far more reliable, resilient and scalable cloud deployment. Enterprise software packaged as a set of Docker containers orchestrated by Kubernetes or Docker Swarm, for example, can be installed pretty much anywhere and be live in minutes.
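To put the hardware price points quoted above in perspective, the arithmetic reduces to simple cost-drop factors (the 1999 and present-day figures are the ones cited in the text; the computation itself is just illustration):

```python
# Per-unit price points quoted in the article, 1999 vs. today (USD).
ram_1999_per_gb, ram_now_per_gb = 1_000, 5       # RAM, per gigabyte
disk_1999_per_tb, disk_now_per_tb = 30_000, 30   # disk storage, per terabyte

ram_drop = ram_1999_per_gb / ram_now_per_gb
disk_drop = disk_1999_per_tb / disk_now_per_tb
print(f"RAM is {ram_drop:.0f}x cheaper; disk is {disk_drop:.0f}x cheaper")
# RAM is 200x cheaper; disk is 1000x cheaper
```

A 200x-to-1000x collapse in hardware costs is why the multi-tenancy argument carries far less weight than it did when Salesforce launched.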

What Benioff didn’t foresee

Several other factors have also emerged in the last few years that raise the question of whether the traditional definition of SaaS can really be the only one going forward. Here, too, there’s irony in the fact that many of the forces pushing software back toward self-hosting and management can be traced directly to the success of SaaS itself, and cloud computing in general:

  1. Cloud computing can now be “private”: Virtual private clouds (VPCs) in the IaaS world allow enterprises to maintain root control of the OS, while outsourcing the physical management of machines to providers like Google, DigitalOcean, Microsoft, Packet or AWS. This allows enterprises (like Capital One) to relinquish hardware management and the headache it often entails, but retain control over networks, software and data. It is also far easier for enterprises to get the necessary assurance for the security posture of Amazon, Microsoft and Google than it is to get the same level of assurance for each of the tens of thousands of possible SaaS vendors in the world.
  2. Regulations can penalize centralized services: One of the underappreciated consequences of Edward Snowden’s leaks, as well as an awakening to the sometimes questionable data-privacy practices of companies like Facebook, is an uptick in governments and enterprises trying to protect themselves and their citizens from prying eyes. Using applications hosted in another country or managed by a third party exposes enterprises to a litany of legal issues. The European Union’s GDPR law, for example, exposes SaaS companies to more potential liability with each piece of EU-citizen data they store, and puts enterprises on the hook for how their SaaS providers manage data.
  3. Data breach exposure is higher than ever: A corollary to the point above is the increased exposure to cybercrime that companies face as they build out their SaaS footprints. All it takes is one employee at a SaaS provider clicking on the wrong link or installing the wrong Chrome extension to expose that provider’s customers’ data to criminals. If the average large enterprise uses 1,000+ SaaS applications and each of those vendors averages 250 employees, that’s an additional 250,000 possible points of entry for an attacker.
  4. Applications are much more portable: The SaaS revolution has resulted in software vendors developing their applications to be cloud-first, but they’re now building those applications using technologies (such as containers) that can help replicate the deployment of those applications onto any infrastructure. This shift to what’s called cloud-native computing means that the same complex applications you can sign up to use in a multi-tenant cloud environment can also be deployed into a private data center or VPC far more easily than was previously possible. Companies like BigID, StackRox, Dashbase and others are taking a private cloud-native-instance-first approach to their application offerings. Meanwhile, SaaS stalwarts like Atlassian, Box, GitHub and many others are transitioning to Kubernetes-driven, cloud-native architectures that provide this optionality in the future.
  5. The script got flipped on CIOs: Individuals and small teams within large companies now drive software adoption by selecting the tools (e.g. GitHub, Slack, HipChat, Dropbox), often SaaS, that best meet their needs. Once they learn what’s being used and how it’s working, CIOs are faced with the decision to either restrict network access to shadow IT or pursue an enterprise license — or the nearest thing to one — for those services. This trend has been so impactful that it spawned an entirely new category called cloud access security brokers — another vendor that needs to be paid, an additional layer of complexity, and another avenue for potential problems. Managing local versions of these applications brings control back to the CIO and CISO.
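The breach-surface arithmetic in point 3 above is worth making explicit; the two inputs are the estimates quoted in the text, and the product is the claimed number of additional entry points:

```python
# Breach-surface estimate from the text: each employee at each SaaS vendor
# is a potential phishing/extension-compromise entry point for an attacker.
saas_vendors = 1_000          # SaaS applications used by a large enterprise
employees_per_vendor = 250    # average headcount at each SaaS vendor

extra_entry_points = saas_vendors * employees_per_vendor
print(extra_entry_points)  # 250000
```

The estimate is rough, of course, but it illustrates how attack surface scales multiplicatively, not additively, with each SaaS vendor an enterprise adds.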

The future of software is location agnostic

As the pace of technological disruption picks up, the previous generation of SaaS companies is facing a future similar to the legacy software providers they once displaced. From mainframes up through cloud-native (and even serverless) computing, the goal for CIOs has always been to strike the right balance between cost, capabilities, control and flexibility. Cloud-native computing, which encompasses a wide variety of IT facets and often emphasizes open-source software, is poised to deliver on these benefits in a manner that can adapt to new trends as they emerge.

The problem for many of today’s largest SaaS vendors is that they were founded and scaled out during the pre-cloud-native era, meaning they’re burdened by some serious technical and cultural debt. If they fail to make the necessary transition, they’ll be disrupted by a new generation of SaaS companies (and possibly traditional software vendors) that are agnostic toward where their applications are deployed and who applies the pre-built automation that simplifies management. This next generation of vendors will put more control in the hands of end customers (who crave control), while maintaining what vendors have come to love about cloud-native development and cloud-based resources.

So, yes, Marc Benioff and Salesforce were absolutely right to champion the “No Software” movement over the past two decades, because the model of enterprise software they targeted needed to be destroyed. In the process, however, Salesforce helped spur a cloud computing movement that would eventually rewrite the rules on enterprise IT and, now, SaaS itself.

Announcing TechCrunch Sessions: Enterprise this September in San Francisco

Of the many categories in the tech world, none is more ferociously competitive than enterprise. For decades, SAP, Oracle, Adobe, Microsoft, IBM and Salesforce, to name a few of the giants, have battled to deliver the tools businesses want to become more productive and competitive. That market is closing in on $500 billion in sales per year, which explains why hundreds of new enterprise startups launch every year and dozens are acquired by the big incumbents trying to maintain their edge.

Last year alone, the top 10 enterprise acquisitions were worth $87 billion and included IBM acquiring Red Hat for $34 billion, SAP paying $8 billion for Qualtrics, Microsoft landing GitHub for $7.5 billion, Salesforce acquiring MuleSoft for $6.5 billion and Adobe grabbing Marketo for $4.75 billion. No startup category has made more VCs and founders wildly wealthy, and none has seen more mighty companies rise faster or fall harder. That technology and business thrill ride makes enterprise a category TechCrunch has long wanted to tackle head on.

TC Sessions: Enterprise (September 5 at San Francisco’s Yerba Buena Center) will take on the big challenges and promise facing enterprise companies today. TechCrunch’s editors, notably Frederic Lardinois, Ron Miller and Connie Loizos, will bring to the stage founders and leaders from established and emerging companies to address rising questions like the promised revolution from machine learning and AI, intelligent marketing automation and the inevitability of the cloud, as well as the outer reaches of technology, like quantum and blockchain.

We’ll enlist proven enterprise-focused VCs to reveal where they are directing their early, middle and late-stage investments. And we’ll ask the most proven serial entrepreneurs to tell us what it really took to build that company, and which company they would like to create next. All throughout the show, TechCrunch’s editors will zero in on emerging enterprise technologies to sort the hype from the reality. Whether you are a founder, an investor, enterprise-minded engineer or a corporate CTO / CIO, TC Sessions: Enterprise will provide a valuable day of new insights and great networking.

Tickets are now available for purchase on our website at the early-bird rate of $249. Want to bring a group of people from your company? Get an automatic 20% savings when you purchase four or more tickets at once. Are you an early-stage startup? We have a limited number of Startup Demo Packages available for $2,000, which includes four tickets to attend the event. Students are invited to apply for a reduced-price student ticket at just $75. Additionally, for each ticket purchased for TC Sessions: Enterprise, you will also be registered for a complimentary Expo Only pass to TechCrunch Disrupt SF on October 2-4.

Interested in sponsoring TC Sessions: Enterprise? Fill out this form and a member of our sales team will contact you.

Huawei launches AI-backed database to target enterprise customers

China’s Huawei is making a serious foray into the enterprise business market after it unveiled a new database management product on Wednesday, putting it in direct competition with entrenched vendors like IBM, Oracle and Microsoft.

The Shenzhen-based company, best known for making smartphones and telecom equipment, claims its newly minted database uses artificial intelligence capabilities to improve tuning performance, a process that traditionally relies on human administrators, by more than 60%.

Called GaussDB, the database works both locally and on public and private clouds. When running on Huawei’s own cloud, GaussDB provides data warehouse services for customers across industries, from finance, logistics and education to automotive.

The database launch was first reported by The Information on Tuesday, citing sources saying it is designed by the company’s secretive database research group called Gauss and will initially focus on the Chinese market.

The announcement comes at a time when Huawei’s core telecom business is drawing scrutiny in the West over the company’s alleged ties to the Chinese government. That segment accounted for 40.8% of Huawei’s total revenues in 2018, according to financial details released by the privately held firm.

Huawei’s consumer unit, which is driven by its fast-growing smartphone and device sales, made up almost half of the company’s annual revenues. Enterprise businesses made up less than a quarter of revenue, but Huawei’s new push into database management is set to add new fuel to the segment.

Meanwhile, at Oracle, more than 900 employees, most of whom worked for its 1,600-staff research and development center in China, were recently let go amid a major company restructuring, multiple media outlets reported earlier this month.

Data provided to TechCrunch by Boss Zhipin offers clues to the layoff: The Chinese recruiting platform has recently seen a surge in newly registered users who work at Oracle China. But the door is still open for new candidates as the American giant is currently recruiting for more than 100 positions through Boss, including many related to cloud computing.

Microsoft and Oracle link up their clouds

Microsoft and Oracle announced a new alliance today that will see the two companies connect their clouds over a direct network connection so that their users can move workloads and data seamlessly between the two. The alliance goes a bit beyond basic direct connectivity and also includes identity interoperability.

This kind of alliance is relatively unusual between what are essentially competing clouds, but while Oracle wants to be seen as a major player in this space, it also realizes that it isn’t likely to get to the size of an AWS, Azure or Google Cloud anytime soon. For Oracle, this alliance means that its users can run services like the Oracle E-Business Suite and Oracle JD Edwards on Azure while still using an Oracle database in the Oracle cloud, for example. With that, Microsoft still gets to run the workloads and Oracle gets to do what it does best (though Azure users will also continue to be able to run their Oracle databases in the Azure cloud, too).

“The Oracle Cloud offers a complete suite of integrated applications for sales, service, marketing, human resources, finance, supply chain and manufacturing, plus highly automated and secure Generation 2 infrastructure featuring the Oracle Autonomous Database,” said Don Johnson, executive vice president, Oracle Cloud Infrastructure (OCI), in today’s announcement. “Oracle and Microsoft have served enterprise customer needs for decades. With this alliance, our joint customers can migrate their entire set of existing applications to the cloud without having to re-architect anything, preserving the large investments they have already made.”

For now, the direct interconnect between the two clouds is limited to Azure US East and Oracle’s Ashburn data center. The two companies plan to expand this alliance to other regions in the future, though they remain mum on the details. It’ll support applications like JD Edwards EnterpriseOne, E-Business Suite, PeopleSoft, Oracle Retail and Hyperion on Azure, in combination with Oracle databases like RAC, Exadata and the Oracle Autonomous Database running in the Oracle Cloud.

“As the cloud of choice for the enterprise, with over 95% of the Fortune 500 using Azure, we have always been first and foremost focused on helping our customers thrive on their digital transformation journeys,” said Scott Guthrie, executive vice president of Microsoft’s Cloud and AI division. “With Oracle’s enterprise expertise, this alliance is a natural choice for us as we help our joint customers accelerate the migration of enterprise applications and databases to the public cloud.”

Today’s announcement also fits within a wider trend at Microsoft, which has recently started building a number of alliances with other large enterprise players, including its open data alliance with SAP and Adobe, as well as a somewhat unorthodox gaming partnership with Sony.

Google Cloud launches Bare Metal Solution

Google Cloud today announced the launch of a new bare metal service, dubbed the Bare Metal Solution. We aren’t talking about bare metal servers offered directly by Google Cloud here, though. Instead, we’re talking about a solution that enterprises can use to run their specialized workloads on certified hardware that’s co-located in the Google Cloud data centers and directly connect them to Google Cloud’s suite of other services. The main workload that makes sense for this kind of setup is databases, Google notes, and specifically Oracle Database.

Bare Metal Solution is, as the name implies, a fully integrated and fully managed solution for setting up this kind of infrastructure. It involves completely managed hardware, including the servers and the rest of the data center facilities like power and cooling; support contracts and billing are handled through Google Cloud’s systems, and the service comes with an SLA. The software that’s deployed on those machines is managed by the customer — not Google.

The overall idea, though, is clearly to make it easier for enterprises with specialized workloads that can’t easily be migrated to the cloud to still benefit from the cloud-based services that need access to the data from these systems. Machine learning is an obvious example, but Google also notes that this provides these companies with a bridge to slowly modernize their tech infrastructure in general (where “modernize” tends to mean “move to the cloud”).

“These specialized workloads often require certified hardware and complicated licensing and support agreements,” Google writes. “This solution provides a path to modernize your application infrastructure landscape, while maintaining your existing investments and architecture. With Bare Metal Solution, you can bring your specialized workloads to Google Cloud, allowing you access and integration with GCP services with minimal latency.”

Because this service is co-located with Google Cloud, there are no separate ingress and egress charges for data that moves between Bare Metal Solution and Google Cloud in the same region.

The servers for this solution, which are certified to run a wide range of applications (including Oracle Database), range from dual-socket 16-core systems with 384 GB of RAM to quad-socket servers with 112 cores and 3072 GB of RAM. Pricing is on a monthly basis, with a preferred term length of 36 months.

Obviously, this isn’t the kind of solution that you self-provision, so the only way to get started — and get pricing information — is to talk to Google’s sales team. But this is clearly the kind of service that we should expect from Google Cloud, which is heavily focused on providing as many enterprise-ready services as possible.

Google brings IBM Power Systems to its cloud

As Google Cloud looks to convince more enterprises to move to its platform, it needs to be able to give businesses an onramp for their existing legacy infrastructure and workloads that they can’t easily replace or move to the cloud. A lot of those workloads run on IBM Power Systems with their Power processors, and, until now, IBM was essentially the only vendor that offered cloud-based Power systems. Now, however, Google is also getting into this game by partnering with IBM to launch IBM Power Systems on Google Cloud.

Update: Seattle-based Skytap also offers support for IBM Power systems and makes them available in its own cloud, as well as Azure and IBM Cloud.

“Enterprises looking to the cloud to modernize their existing infrastructure and streamline their business processes have many options,” writes Kevin Ichhpurani, Google Cloud’s corporate VP for its global ecosystem, in today’s announcement. “At one end of the spectrum, some organizations are re-platforming entire legacy systems to adopt the cloud. Many others, however, want to continue leveraging their existing infrastructure while still benefiting from the cloud’s flexible consumption model, scalability, and new advancements in areas like artificial intelligence, machine learning, and analytics.”

Power Systems support obviously fits in well here, given that many companies use them for mission-critical workloads based on SAP and Oracle applications and databases. With this, they can take those workloads and slowly move them to the cloud, without having to re-engineer their applications and infrastructure. Power Systems on Google Cloud is obviously integrated with Google’s services and billing tools.

This is very much an enterprise offering, without a published pricing sheet. Chances are, given the cost of a Power-based server, you’re not looking at a bargain, per-minute price here.

Because IBM has its own cloud offering, it’s a bit odd to see it work with Google to bring its servers to a competing cloud — though it surely wants to sell more Power servers. The move makes perfect sense for Google Cloud, though, which is on a mission to bring more enterprise workloads to its platform. Any roadblock the company can remove works in its favor, and, as enterprises get comfortable with its platform, they’ll likely bring other workloads to it over time.

Salesforce co-CEO Keith Block steps down

Salesforce today announced that Keith Block, the company’s co-CEO, is stepping down. This leaves company founder Marc Benioff as the sole CEO and chair of the CRM juggernaut. Block’s bio has already been wiped from Salesforce’s leadership page.

Block stepped into the co-CEO role in 2018, after a long career at the company that saw him become vice chairman, president and director before he took this position. Block spent the early years of his career at Oracle. He left there in 2012 after the release of a number of documents in which he criticized then-Oracle CEO Mark Hurd, who passed away last year.

Industry pundits saw his elevation to the co-CEO role as a sign that Block was next in line as the company’s sole CEO in the future (assuming Benioff would ever step down). After this short tenure as co-CEO, it doesn’t look like that will be the case, but for the time being, Block will stay on as an advisor to Benioff.

“It’s been my greatest honor to lead the team with Marc [Benioff] that has more than quadrupled Salesforce from $4 billion of revenue when I joined in 2013 to over $17 billion last year,” said Block in a canned statement that was surely not written by the Salesforce PR team. “We are now a global enterprise company, focused on industries, and have an ecosystem that is the envy of the industry, and I’m so grateful to our employees, customers, and partners. After a fantastic run I am ready for my next chapter and will stay close to the company as an advisor. Being side-by-side with Marc has been amazing and I’m forever grateful for our friendship and proud of the trajectory the company is on.”

In related news, the company also today announced that it has named former BT Group CEO Gavin Patterson as its president and CEO of Salesforce International.


Equity Monday: Quibi, two Boston rounds and a shift to pessimism

Hello and welcome back to Equity, TechCrunch’s venture capital-focused podcast, where we unpack the numbers behind the headlines. This is Equity Monday, our short-form week-starter in which we go over the weekend, look to the week ahead, talk about some neat funding rounds and dig into what is stuck on our minds.

So, by section then:

The weekend:

  • The market narrative seems to have changed from optimism to pessimism, impacting stock prices and possibly closing the IPO window some, after it had unexpectedly opened.
  • Quibi news is out that isn’t great: The mobile-first launch that came during a lockdown hasn’t helped the hugely funded service that had to convince the world that its content format was great. We calculate its effective cost-per-subscriber number and it isn’t super great.

The week ahead:

  • Earnings from Groupon and Oracle. The former could tell us a little bit about the health of the consumer perhaps? And Oracle is a player in the cloud space, so its earnings might help us understand what’s up in that world. See, not everything cloud-related comes from Seattle.
  • And we note the grip of tech conferences that were put on hold due to COVID-19, wondering what they might look like next year; do we ever go back to the way that things used to be?

Funding rounds:

What’s on our minds:

Equity drops every Friday at 6:00 am PT, so subscribe to us on Apple Podcasts, Overcast, Spotify and all the casts.

Standing by developers through Google v. Oracle

The Supreme Court will hear arguments tomorrow in Google v. Oracle. This case raises a fundamental question for software developers and the open-source community: Whether copyright may prevent developers from using software’s functional interfaces — known as APIs — to advance innovation in software. The court should say no — free and open APIs protect innovation, competition and job mobility for software developers in America.

When we use an interface, we don’t need to understand (or care) about how the function on the other side of the interface is performed. It just works. When you sit down at your computer, the QWERTY keyboard allows you to rapidly put words on the screen. When you submit an online payment to a vendor, you are certain the funds will appear in the vendor’s account. It just works.

In the software world, interfaces between software programs are called “application programming interfaces” or APIs. APIs date back to the 1950s and allow developers to write programs that reuse other program functionality without knowing how that functionality is performed. If your program needs to sort a list, you could have it use a sorting program’s API to sort the list for your program. It just works.
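To make that concrete, here is a minimal Python sketch (the list and names are illustrative): the caller depends only on the sorting API's documented contract, never on how the sort is implemented under the hood.

```python
# Using Python's built-in sorting API: we rely on the documented
# interface (sorted takes an iterable, a key function and a reverse
# flag) without knowing anything about the underlying algorithm.
scores = [("ada", 91), ("grace", 88), ("alan", 95)]

# Sort by score, descending -- the API contract is all we need.
ranked = sorted(scores, key=lambda pair: pair[1], reverse=True)
print(ranked)  # [('alan', 95), ('ada', 91), ('grace', 88)]
```

The caller's code stays the same even if the interpreter swaps out the sorting algorithm entirely. It just works.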

Developers have historically used software interfaces free of copyright concerns, and this freedom has accelerated innovation, software interoperation and developer job mobility. Developers using existing APIs save time and effort, allowing those savings to be refocused on new ideas. Developers can also reimplement APIs from one software platform to others, enabling innovation to flow freely across software platforms.

Importantly, reusing APIs gives developers job portability, since knowledge of one set of APIs is more applicable cross-industry. The upcoming Google v. Oracle decision could change this, harming developers, open-source software and the entire software industry.

Google v. Oracle and the platform API bargain

Google v. Oracle is the culmination of a decade-long dispute. Back in 2010, Oracle sued Google, arguing that Google’s Android operating system infringed Oracle’s rights in Java. After ten years, the dispute now boils down to whether Google’s reuse of Java APIs in Android was copyright infringement.

Prior to this case, most everyone assumed that copyright did not cover the use of functional software like APIs. Under that assumption, competing platforms’ API reimplementation allowed developers to build new yet familiar things according to the API bargain: Everyone could use the API to build applications and platforms that interoperate with each other. Adhering to the API made things “just work.”

But if the Google v. Oracle decision indicates that API reimplementation requires copyright permission, the bargain falls apart. Nothing “just works” unless platform makers say so; they now dictate rules for interoperability — charging developers huge prices for the platform or stopping rival, compatible platforms from being built.
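As a small illustration of that bargain, consider this hypothetical Python example (the storage API and both implementations are invented for this sketch): client code written once against an API runs unchanged on a rival reimplementation of the same interface.

```python
# A hypothetical "storage" API reimplemented by two platforms.
# Client code written against the API runs unchanged on either one --
# this interchangeability is what the API bargain preserves.

class MemoryStore:
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

class LoggingStore:
    """A rival reimplementation of the same put/get interface."""
    def __init__(self):
        self._data = {}
        self.log = []
    def put(self, key, value):
        self.log.append(("put", key))
        self._data[key] = value
    def get(self, key):
        self.log.append(("get", key))
        return self._data.get(key)

def client(store):
    # Written once against the API; works on any conforming platform.
    store.put("lang", "java")
    return store.get("lang")

print(client(MemoryStore()))   # java
print(client(LoggingStore()))  # java
```

If reimplementing the `put`/`get` interface required the first platform's permission, the second platform could never be built, and the client code would be locked in.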

Free and open APIs are essential for modern developers

If APIs are not free and open, platform creators can stop competing platforms from using compatible APIs. This lack of competition blocks platform innovation and harms developers who cannot as easily transfer their skills from project to project, job to job.

MySQL, Oracle’s popular database, reimplemented mSQL’s APIs so third-party applications for mSQL could be “ported easily” to MySQL. If copyright had restricted reimplementation of those APIs, adoption of MySQL, reusability of old mSQL programs and the expansion achieved by the “LAMP” stack would have been stifled, and the whole ecosystem would be poorer for it. This and other examples of API reimplementation — IBM’s BIOS, Windows and WINE, UNIX and Linux, Windows and WSL, .NET and Mono — have driven perhaps the most amazing innovation in human history, with open-source software becoming critical digital infrastructure for the world.

Similarly, a copyright block on API-compatible implementations puts developers at the mercy of platform makers — both for their skills and their programs. Once a program is written for a given set of APIs, it is locked in to the platform unless those APIs can also be used on other software platforms. And once a developer learns a given API, it is much easier to reuse that knowledge than to retrain on the APIs of another platform. If the platform creator decides to charge outrageous fees, or to end platform support, the developer is stuck. For nondevelopers, imagine this: the QWERTY layout is copyrighted and the copyright owner decides to charge $1,000 per keyboard. You would have a choice: retrain your hands or pay up.

All software used by anyone was created by developers. We should give developers the right to freely reimplement APIs, as developer ability to shift applications and skills between software ecosystems benefits everyone — we all get better software to accomplish more.

I hope that the Supreme Court’s decision will pay heed to what developer experience has shown: Free and open APIs promote freedom, competition, innovation and collaboration in tech.

Salto raises $27M to let you configure your SaaS platforms with code

Salto, a Tel Aviv-based open-source startup that allows you to configure SaaS platforms like Salesforce, NetSuite and HubSpot with code, is coming out of stealth today and announced that it has raised a $27 million Series A round. This round was led by Bessemer Venture Partners, Lightspeed Venture Partners and Salesforce Ventures.

The general idea here — which is similar to the “infrastructure-as-code” movement — is to allow business operations teams to automate the labor-intensive and error-prone ways they currently use to manage SaaS platforms. While others in this space are betting on no-code solutions for managing these systems, Salto is going the other way and is betting on code instead.

“We realized the challenges BizOps teams face are very similar to the problems encountered by software and DevOps engineers on a daily basis,” writes Salto co-founder and CEO Rami Tamir in today’s announcement. “So we adapted software development fundamentals and best practices to the BizOps field. There’s no need to reinvent the wheel; the same techniques used to make high-quality software can also be applied to keeping control over business applications.”

Image Credits: Salto

Salto makes the core of its service available as open source. This open-source version includes the company’s NaCl language, a declarative configuration language based on the syntax of HashiCorp’s HCL; a command-line interface for deploying configuration changes (and fetching the current configuration state of an application); and a VS Code extension.

In combination with Git, business operations teams can collaborate on writing these configurations and test them in staging environments. The company is essentially taking modern software development practices and applying them to business operations.
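As a rough illustration of the configuration-as-code idea (a toy sketch only — this is not Salto’s actual configuration language or CLI, and the keys are invented), a tool in this style diffs the declared configuration against the live state of a SaaS platform and produces a plan of changes:

```python
# Illustrative only: a toy "configuration-as-code" diff in the spirit
# of tools like Salto. Keys and values are hypothetical.
desired = {
    "salesforce.lead.fields": ["name", "email", "region"],
    "hubspot.pipeline.stages": ["new", "qualified", "won"],
}
actual = {
    "salesforce.lead.fields": ["name", "email"],
    "hubspot.pipeline.stages": ["new", "qualified", "won"],
}

def plan(desired, actual):
    # Compute the changes needed to bring the live system in line
    # with the declared configuration, much like `terraform plan`
    # does for infrastructure.
    return {k: v for k, v in desired.items() if actual.get(k) != v}

print(plan(desired, actual))
# {'salesforce.lead.fields': ['name', 'email', 'region']}
```

Because the desired state lives in text files, it can be reviewed in pull requests, tested in staging and rolled back with ordinary Git workflows.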

Image Credits: Salto

“Defining a company’s business logic as code can make a fundamental change in the way business applications are delivered,” writes Tamir. “We like to think about it as ‘company-as-code,’ much in the same way as ‘infrastructure-as-code’ transformed the way we manage data centers.”

Some of the use cases here are configuring custom Salesforce CPQ fields, syncing profiles across Salesforce environments and maintaining audit logs for NetSuite. For now, the company only supports connections to Salesforce, HubSpot and NetSuite, with others following soon.

Like other open-source companies, Salto’s business model involves selling a hosted version of its service, which the company is also announcing today.

In terms of raising this new round, it surely helped that the founding team, which includes Benny Schnaider and Gil Hoffer in addition to Tamir, previously sold the three companies they founded: Cisco acquired Pentacom in 2000, Red Hat acquired Qumranet in 2008 and Oracle acquired Ravello Systems in 2016.

“Business agility is more important than ever today, and the alignment of external business services to real business needs is increasing in strategic importance,” said Alex Kayyal, partner and head of International at Salesforce Ventures. “BizOps teams are becoming more and more crucial to the success of companies. With Salto they are empowered to meet the tasks they are charged with, equipped with modernized methodologies and a greatly enhanced toolbox.”

TikTok’s forced sale to Oracle is put on hold

The insane saga of a potential forced sale of TikTok’s U.S. operations is reportedly ending — another victim of the transition to methodical and rational policymaking that appears to be the boring new normal under the presidency of Joe Biden.

Last fall, the U.S. government under President Donald Trump took a stab at “gangster capitalism” by trying to force the sale of TikTok to a group of buyers including Oracle and Walmart.

While the effort was doomed from the start, with TikTok’s parent company ByteDance winning most of the legal challenges to the government effort, a Rubicon had effectively been crossed where the U.S. government appeared willing to spend political capital to stymie the growth of a successful foreign business on its shores for the flimsiest of security reasons.

Now, The Wall Street Journal is reporting that the efforts by the U.S. government to push the deal forward “have been shelved indefinitely,” citing sources familiar with the process.

However, discussions between TikTok and U.S. national security officials are continuing because there are valid concerns around TikTok’s data collection and the potential for manipulation and censorship of content on the app.

In the meantime, the U.S. is taking a look at all of the potential threats to data privacy and security from intrusions by foreign governments or using tech developed overseas, according to Emily Horne, the spokeswoman for the National Security Council.

“We plan to develop a comprehensive approach to securing U.S. data that addresses the full range of threats we face,” Horne told the WSJ. “This includes the risk posed by Chinese apps and other software that operate in the U.S. In the coming months, we expect to review specific cases in light of a comprehensive understanding of the risks we face.”

Last year, then-President Trump ordered a ban on TikTok, intending to force the sale of the Chinese-owned, short-form video distribution service to a U.S.-owned investment group.

As part of that process, the Committee on Foreign Investment in the U.S. ordered ByteDance to divest of its U.S. operations. TikTok appealed that order in court in Washington last November as the U.S. was roiled by the presidential election and its aftermath.

That case is still pending, but separate federal court rulings have blocked the U.S. government from shutting down TikTok.

Months later, we’re still making sense of the Supreme Court’s API copyright ruling

APIs, or application programming interfaces, make the digital world go round. Working behind the scenes to define the parameters by which software applications communicate with each other, APIs underpin every kind of app — social media, news and weather, financial, maps, video conferencing, you name it. They are critically important to virtually every enterprise organization and industry worldwide.

Given APIs’ ubiquity and importance, it’s understandable that all industry eyes were on the U.S. Supreme Court’s April 5 ruling in Google LLC v. Oracle America Inc., an 11-year-old case that addressed two core questions: Whether copyright protection extends to an API, and whether use of an API in the context of creating a new computer program constitutes fair use. Google lawyers had called it “the copyright case of the decade.”

I was one of 83 computer scientists — including five Turing Award winners and four National Medal of Technology honorees — who signed a Supreme Court amicus brief stating their opposition to the assertion that APIs are copyrightable, while also supporting Google’s right to fair use under the current legal definition.

To be clear: APIs should be free of copyright, no ifs, ands or buts.

We explained that the freedom to reimplement and extend existing APIs has been critical to technological innovation by ensuring competitors could challenge established players and advance the state of the art. “Excluding APIs from copyright protection has been essential to the development of modern computers and the internet,” the brief said.

The Supreme Court ruling was a mixed bag that many observers are still parsing. In a 6-2 decision, justices sided with Google and its argument that the company’s copying of 11,500 lines of code from Oracle’s Java in the Android operating system was fair use. Great! At the same time, though, the court appeared to be operating under the assumption that APIs are copyrightable.

“Given the rapidly changing technological, economic and business-related circumstances, we believe we should not answer more than is necessary to resolve the parties’ dispute,” Justice Stephen Breyer wrote for the majority. “We shall assume, but purely for argument’s sake, that [the code] falls within the definition of that which can be copyrighted.”

While it may take years to fully understand the ruling’s impact, it’s important to keep dissecting the issue now, as APIs only continue to become more essential as the pipes behind every internet-connected device and application.

The legal saga began when Google used Java APIs in developing Android. Google wrote its own implementation of the Java APIs, but in order to allow developers to write their own programs for Android, Google’s implementation used the same names, organization and functionality as the Java APIs.

Oracle sued Google in U.S. District Court for the Northern District of California in August 2010, seven months after it closed its acquisition of Java creator Sun Microsystems, contending that Google had infringed Oracle’s copyright.

In May 2012, Judge William Alsup ruled that APIs are not subject to copyright because that would hamper innovation. Oracle appealed the ruling to the U.S. Court of Appeals, which reversed Judge Alsup in May 2014, finding that the Java APIs are copyrightable. However, the appeals court also sent the case back to the trial court to determine whether Google had a fair use defense.

A new District Court trial began in May 2016 on the fair use question. A jury found that Google’s implementation of the Java API was fair use. Oracle appealed, and the U.S. Court of Appeals in March 2018 again reversed the lower court. Google filed a petition with the Supreme Court in January 2019, receiving a hearing date in early 2020. However, lengthening the case’s tortuous path through the courts even further, COVID-19 forced oral arguments to be postponed to last October. Finally, on April 5, the Supreme Court settled the matter.

Or did it?

“Supreme Court Leaves as Many Questions as It Answers in Google v. Oracle,” read a headline on law.com. The National Law Review said: “The Supreme Court sidestepped the fundamental IP issue — whether or not Oracle’s software code at the heart of the case is copyrightable.”

On one hand, I’m disappointed that the court’s ruling left even a hint of ambiguity about whether APIs are copyrightable. To be clear: APIs should be free of copyright, no ifs, ands or buts.

APIs provide structure, sequence and organization for digital resources in the same way that a restaurant menu does for food. Imagine if Restaurant A, which serves burgers, fries and shakes, couldn’t use the same words, or the same ordering and organization of those words, on its menu as Restaurant B. A menu doesn’t represent a novel expression; rather, it is the ingredients, processes and service that define a restaurant. Both burger places benefit from the shared concept of a menu and the shared knowledge among their consumers of what burgers, fries and shakes are. It is the execution of the menu that ultimately will set one restaurant apart from another.

Likewise, APIs are not intellectual property; they are simply operational elements that are common, reusable and remixable, meant to be put to use in as many applications by as many developers as possible.

This pattern plays out over and over across many different sectors of our economy where APIs are being used, reused and remixed to generate new kinds of applications, integrations or entirely new companies and products or services. Immense value is generated by the free, collective, collaborative and open evolution of APIs.

On the other hand, I’m pleased by the part of the Supreme Court ruling that widens the definition of fair use. I think that provides the scope needed to take the industry into its API future without too much friction.

I also believe the case will chill future attempts by other companies to engage in litigation over API copyright. In the end, the decade-long Google vs. Oracle case negatively affected Oracle’s image when it comes to the fast-growing API sector, and I suspect other companies will think twice before going to court.

Nevertheless, companies may want to be extra careful to license their APIs under the widest possible terms, applying a Creative Commons CC0 or CC BY license to APIs built with tools and specifications such as Swagger, OpenAPI and AsyncAPI.
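For instance, OpenAPI descriptions carry a standard `info.license` field where such a grant can be declared. A minimal sketch (shown here as a Python dict for illustration, though these specs are usually written in YAML or JSON; the API name is hypothetical):

```python
# An OpenAPI 3 description declaring a permissive CC0 license via the
# standard info.license field (structure shown as a Python dict).
openapi_spec = {
    "openapi": "3.0.3",
    "info": {
        "title": "Example Menu API",  # hypothetical API name
        "version": "1.0.0",
        "license": {
            "name": "CC0-1.0",
            "url": "https://creativecommons.org/publicdomain/zero/1.0/",
        },
    },
    "paths": {},
}
print(openapi_spec["info"]["license"]["name"])  # CC0-1.0
```

Tooling that consumes the spec can then surface the license to developers up front, before they build against the interface.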

Now that Google vs. Oracle is finally history, I feel that the API sector will remain as vibrant as ever. That’s excellent news for everybody.

EU clears Microsoft-Nuance without conditions

The European Union’s competition regulator has given the all-clear to Microsoft’s $19.7 billion purchase of transcription tech firm Nuance, which was announced earlier this year.

The EU said today it has concluded there are no competition concerns for the region if the acquisition goes ahead, clearing it without conditions.

The deal was notified to the Commission’s regulators on November 16.

While Microsoft-Nuance has gotten a green light from the bloc, the U.K.’s Competition and Markets Authority has just opened its own preliminary investigation — so some regional scrutiny continues.

On the EU side, the Commission’s investigation looked at horizontal overlaps between Nuance and Microsoft in the markets for transcription software — finding the two provide very different products (out-of-the-box software for end-users versus APIs targeted at devs wanting to add speech recognition tech to their apps, respectively).

It also determined that the combined entity will continue to face “strong” competition from other players.

The EU looked at the vertical link between Microsoft’s cloud computing services and Nuance’s downstream transcription software for healthcare, too — but found competing transcription service providers in the sector do not depend on Microsoft for cloud computing.

Nor are these types of transcription service providers particularly key users of cloud computing, per the Commission.

Its investigation considered conglomerate links between Nuance’s software (which is only available for Windows) and a number of Microsoft products as well — but took the view that the combined entity would not have the ability and/or incentive to foreclose competitors in the markets for (healthcare) transcription software; enterprise communication services; CRM software; productivity software; and PC operating systems.

And, there again, the EU decided the combined entity will still face strong competition.

Perhaps most interestingly, the Commission looked at the use of data transcribed by Nuance’s software.

This is interesting because healthcare data is highly sensitive and Microsoft — while not a massive player in adtech — does have ambitions to grow that side of its business. Just today it announced the purchase of adtech firm Xandr (formerly AppNexus), from AT&T, to beef up its digital advertising biz, for example.

Add to that, in recent days, Oracle — a tech giant with an already massive digital marketing biz — has signalled its own grand designs on health — announcing the acquisition of Cerner, a U.S. provider of electronic health record (EHR) systems.

And, well, the prospect of adtech companies harvesting health data makes plenty of people queasy about privacy.

Nonetheless, the Commission’s assessment of the data side of the Microsoft-Nuance gave it a clean bill of health — owing to existing “contractual restrictions” and regional data protection regulations.

While this analysis is largely coming from a competition perspective, it’s notable that data protection got (another) shout-out in an EU antitrust assessment. (The earlier instance relates to the EU’s [ongoing] scrutiny of Google’s adtech; which followed much criticism from privacy advocates after the Commission cleared Google’s purchase of Fitbit last year, albeit — in that case — with conditions that included limits on Google using Fitbit health data for ads.)

“The Commission concluded that Nuance can use this data only to provide its services,” the EU writes in a press release on the clearance of Microsoft-Nuance. “It is not used by any other company and cannot be used for any other purpose due to contractual restrictions and data protection legislation.”

The EU’s antitrust division also concluded that access to Nuance’s data will not provide Microsoft with an advantage that would enable it to shut out competing healthcare software providers — given “important transcribed information is typically stored in third-party applications like electronic health record (EHR) systems that combine data from several sources, as opposed to Nuance’s fragmented speech data”.

That tidbit raises questions about whether Oracle’s acquisition of Cerner, a provider of EHR systems, may face more probing questions from EU competition regulators — if/when they come to consider that Big Tech healthcare sector deal.

Although Cerner’s relative lack of regional customers — it sold off some of its European portfolio last year — may serve to reduce or limit the scope of any EU concerns.

Department of Labor sues Oracle over discriminatory pay and hiring practices

The U.S. Department of Labor has sued Oracle for discriminatory employment practices, the government body announced on Wednesday. The Department of Labor specifically states that the company has “a systemic practice” in place of paying white male workers more than others in the same role, including “women, African American and Asian employees.”

The suit also alleges that Oracle favors Asian workers in its recruitment efforts for technical and product roles, resulting in discrimination against non-Asian job seekers applying for those roles. The Department of Labor is able to bring the suit because Oracle is a federal contractor, providing services, software and hardware to the U.S. government, and federal contractors are required to maintain equitable and fair hiring practices. The full complaint from the Department of Labor is embedded below.

For its part, Oracle denies the claims. The company provided the following statement to TechCrunch via a spokesperson:

The complaint is politically motivated, based on false allegations, and wholly without merit. Oracle values diversity and inclusion, and is a responsible equal opportunity and affirmative action employer. Our hiring and pay decisions are non-discriminatory and made based on legitimate business factors including experience and merit.

Oracle’s claim that the suit is “politically motivated” might be an oblique reference to the appointment of its co-CEO Safra Catz to Donald Trump’s presidential transition team, which was announced in December.

Should Oracle be found to have committed wrongdoing in this case, it could face the cancellation of all current government contracts, as well as a ban on receiving future contracts from the federal government.

U.S. Department of Labor complaint v. Oracle, January 18, 2017. by TechCrunch on Scribd

Department of Labor sues Oracle over discriminatory pay and hiring practices by Darrell Etherington originally published on TechCrunch


Oracle breaks with tech industry in backing human trafficking bill

Oracle is one of the few in the tech industry backing a bipartisan bill to hold websites facilitating human trafficking legally accountable.

The Stop Enabling Sex Traffickers Act, sponsored by Senator Richard Blumenthal (D-CT) and Senator Rob Portman (R-OH), would amend part of a ’90s-era law (Section 230) that currently protects social networking sites and online platforms such as Google and Facebook from being held legally liable for content shared by users on their sites.

Classified ads site Backpage is currently using this shield to protect itself from being sued for third-party content on its site allegedly aiding illegal prostitution and selling children for sex.

However, the opposition fears the bill would open tech companies up to endless lawsuits and stifle digital innovation.

In August, a coalition of 10 trade associations and lobbying groups representing the tech industry sent a letter to Senators Blumenthal and Portman, telling them the bill would “severely undermine a crucial protection for legitimate online companies, and would be counterproductive to those companies’ efforts to combat trafficking crimes.”

The next day, Google lobbyist Stewart Jeffries wrote a letter warning the Stop Enabling Sex Traffickers Act would “seriously jeopardize the internet ecosystem.”

Baffled, Oracle senior vice president Kenneth Glueck wrote his own letter to senators Portman and Blumenthal.

“Your legislation does not, as suggested by the bill’s opponents, usher the end of the internet,” Glueck wrote. “If enacted, it will establish some measure of accountability for those that cynically sell advertising but are unprepared to help curtail sex trafficking.”

It should be noted, Oracle works in the cloud services space and, unlike Google and Facebook, does not deal in third-party content that would leave it open to lawsuits under the proposed amendment to the law.

Those in favor of the bill, which has the support of more than a quarter of the Senate and is backed by 100 co-sponsors in the House, believe tech giants should be helping to end child sex trafficking, not digging in their heels. They argue these companies have the resources to combat this issue and that the bill is narrow enough in scope not to leave them open to endless legal woes.

“We are pleased with the growing support for the bipartisan Stop Enabling Sex Traffickers Act, and welcome Oracle’s important voice to this effort,” Senator Portman wrote. “It is an acknowledgement that this simple, bipartisan bill is the right prescription for fixing a fundamental flaw in the law that has enabled online sex traffickers to escape justice. It’s time for Congress to act on this bipartisan bill.”

Oracle breaks with tech industry in backing human trafficking bill by Sarah Buhr originally published on TechCrunch

Google Cloud launches AlloyDB, a new fully managed PostgreSQL database service

Google today announced the launch of AlloyDB, a new fully managed PostgreSQL-compatible database service that the company claims to be twice as fast for transactional workloads as AWS’s comparable Aurora PostgreSQL (and four times faster than standard PostgreSQL for the same workloads and up to 100 times faster for analytical queries).

If you’re deep into the Google Cloud ecosystem, then a fully managed PostgreSQL database service may sound familiar. The company, after all, already offers Cloud SQL for PostgreSQL, and Spanner, Google Cloud’s fully managed relational database service, also offers a PostgreSQL interface. But those services merely expose a PostgreSQL-compatible interface so that developers with those skills can use them. AlloyDB, by contrast, is the standard PostgreSQL database at its core, though the team did modify the kernel to let it use Google’s infrastructure to the fullest, all while allowing the team to stay up to date with new versions as they launch.

Image Credits: Google

Andi Gutmans, who joined Google as its GM and VP of Engineering for its database products in 2020 after a long stint at AWS, told me that one of the reasons the company is launching this new product is that while Google has done well in helping enterprise customers move their MySQL and PostgreSQL servers to the cloud with the help of services like CloudSQL, the company didn’t necessarily have the right offerings for those customers who wanted to move their legacy databases (Gutmans didn’t explicitly say so, but I think you can safely insert “Oracle” here) to an open source service.

“There are different reasons for that,” he told me. “First, they are actually using more than one cloud provider, so they want to have the flexibility to run everywhere. There are a lot of unfriendly licensing gimmicks, traditionally. Customers really, really hate that and, I would say, whereas probably two to three years ago, customers were just complaining about it, what I notice now is customers are really willing to invest resources to just get off these legacy databases. They are sick of being strapped and locked in.”

Add to that Postgres’ rise to becoming somewhat of a de facto standard for relational open source databases (and MySQL’s decline) and it becomes clear why Google decided that it wanted to be able to offer a dedicated high-performance PostgreSQL service.

Image Credits: Google

Gutmans also noted that a lot of Google’s customers now want to use their relational databases for analytics use cases, so the team spent quite a lot of effort on making Postgres perform better for these users. Given Gutmans’ background at AWS, where he was the engineering owner for a number of AWS analytics services, that’s probably no surprise.

“When I joined AWS, it was an opportunity to stay in the developer space but really work on databases,” he explained.  “That’s when I worked on things like graph databases and [Amazon] ElastiCache and, of course, got the opportunity to see how important and critical data is to customers. [ … ] That kind of audience was really a developer audience primarily, because that’s developers using databases to build their apps. Then I went into the analytics space at AWS, and I kind of discovered the other side of it. On one hand, the folks I was talking to were not necessarily developers anymore — a lot of them were on the business side or analysts — but I also then saw that these worlds are really converging.” These users wanted to get real-time insights from their data, run fraud detection algorithms over it or do real-time personalization or inventory management at scale.

Image Credits: Google

On the technical side, the AlloyDB team built on top of Google’s existing infrastructure, which disaggregates compute and storage. That’s the same infrastructure layer that runs Spanner, BigQuery and essentially all of Google’s services. This, Gutmans argued, already gives the service a leg up over its competition, in addition to the fact that AlloyDB specifically focuses on PostgreSQL and nothing else. “You don’t always get to optimize as much when you have to support more than one [database engine and query language]. We decided that what enterprises are asking us for [is] Postgres for these legacy database migrations, so let’s just do the best in Postgres.”

The changes the team made to the Postgres kernel now allow it to scale the system linearly to more than 64 virtual cores. On the analytical side, the team built a custom machine learning-based caching service that learns a customer’s access patterns and converts Postgres’ row format into an in-memory columnar format that can be analyzed significantly faster.
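AlloyDB’s internals aren’t public, but the row-to-columnar idea behind that speedup is simple to illustrate. Here is a minimal, purely illustrative Python sketch (not Google’s code, and every name in it is hypothetical): an analytical aggregate over a columnar layout only has to scan the columns it actually touches, instead of walking every field of every row.

```python
# Illustrative sketch of a row-store vs. column-store layout.
rows = [
    {"id": 1, "region": "us", "amount": 120.0},
    {"id": 2, "region": "eu", "amount": 75.5},
    {"id": 3, "region": "us", "amount": 30.0},
]

def to_columnar(rows):
    """Pivot a list of row dicts into a dict of column lists."""
    columns = {key: [] for key in rows[0]}
    for row in rows:
        for key, value in row.items():
            columns[key].append(value)
    return columns

cols = to_columnar(rows)

# An aggregate like SUM(amount) WHERE region = 'us' now reads two
# contiguous lists rather than every field of every row.
us_total = sum(
    amount
    for region, amount in zip(cols["region"], cols["amount"])
    if region == "us"
)
print(us_total)  # 150.0
```

In a real system the columnar copy lives in memory alongside the row store, and (per Gutmans) a learned caching layer decides which tables and columns are worth converting.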


Google Cloud launches AlloyDB, a new fully managed PostgreSQL database service by Frederic Lardinois originally published on TechCrunch

FCC Commissioner writes to Apple and Google about removing TikTok


An FCC Commissioner, Brendan Carr, wrote to Apple and Google on Tuesday, requesting the companies remove TikTok from their app stores for “its pattern of surreptitious data practices.” This comes after BuzzFeed News reported last week that TikTok’s staff in China had access to U.S.-based users’ data up until January.

“As you know TikTok is an app that is available to millions of Americans through your app stores, and it collects vast troves of sensitive data about those U.S. users. TikTok is owned by Beijing-based ByteDance — an organization that is beholden to the Communist Party of China and required by the Chinese law to comply with PRC’s surveillance demands,” Carr said in a letter addressed to Sundar Pichai and Tim Cook.

“It is clear that TikTok poses an unacceptable national security risk due to its extensive data harvesting being combined with Beijing’s apparently unchecked access to that sensitive data.”

After BuzzFeed News published its report, TikTok quickly went on the defensive and announced that it is moving all U.S. users’ data to Oracle servers situated in the country. It specified that the company still uses its own U.S. and Singapore-based servers for backup. But in the future, it expects to “delete U.S. users’ private data from our own data centers and fully pivot to Oracle cloud servers located in the U.S.”

“We’re also making operational changes in line with this work — including the new department we recently established, with U.S.-based leadership, to solely manage U.S. user data for TikTok,” the company added.

TikTok’s user data practices have come under suspicion many times. In 2020, India banned TikTok over national security concerns, and both former President Donald Trump and current President Joe Biden have raised questions about the short video app’s relations with China and how they affect U.S. users’ data. While Trump proposed an outright ban on TikTok or an option of selling its U.S. business to a local buyer, Biden proposed new rules that would give more oversight of apps with ties to the “jurisdiction of foreign adversaries” that may pose national security risks.

Apple and Google didn’t comment on the story.

Update July 1, 10 AM IST: Updated the story to reflect that TechCrunch didn’t receive any comments from Apple and Google.

FCC Commissioner writes to Apple and Google about removing TikTok by Ivan Mehta originally published on TechCrunch

Bluechip, an African systems integrator with partners like Microsoft and Oracle, is expanding to Europe


It’s not often you hear about African tech companies expanding into Europe. Some examples include fintechs Lidya and Korapay in Eastern Europe and the U.K., respectively. In the latest development, Bluechip Technologies, an African enterprise company that partners with international OEMs like Microsoft and Oracle and provides data warehousing solutions and enterprise applications to banks, telcos and insurance firms, is announcing its European launch. 

The Nigeria-based systems integrator said the strategic expansion positions it as a “new competitive entrant in the EU market offering data warehousing and analytics products as well as highly experienced senior data engineers from its Nigeria team as consultants for European firms.” 

Olumide Soyombo, one of Nigeria’s high-profile angel investors who launched Voltron Capital last year, started Bluechip Technologies in 2008 with Kazeem Tewogbade. The company specializes in data warehousing, analytics and enterprise systems for banks and telcos. Having launched with a ₦5 million (~$30,000 at the time) seed investment from Soyombo’s father, Bluechip Technologies has grown to employ nearly 200 consultants and expanded across other African markets such as Kenya, the Democratic Republic of the Congo, Zambia and Ghana. Some of its clients have pan-African and global reach, including FirstBank, MTN, 9mobile, Lafarge, GTBank and Access Bank. 

Bluechip’s data warehouse product collates data from disparate sources, translating them into information that gets businesses to understand trends such as customer lifetime value, churn and business analytics on gathered data. Telcos also use its simplex voucher management system to create airtime vouchers. 

With its recently launched Primo Academy, a six-month pipeline program that trains data professionals (sort of like an Andela-esque model) for itself and for local and international partners, Bluechip Technologies is also one of the few African tech companies focusing on training and placing data professionals. 

According to Soyombo, the post-pandemic trend in remote working, a critical shortage of tech talent and an increase in demand for managing data more efficiently present a great opportunity for his company to deliver specialized services in Europe (recent research projects the region’s big data and business analytics market size to hit $105 billion+ by 2027). Also, the company, having delivered — in partnership with international OEMs — a range of enterprise tech infrastructure solutions in the African market, thinks it can do the same in Europe and plans to target the telco and banking sector from its Ireland base. 

“We built this core enterprise business application for banks and telcos and the talent pool to address these needs. The whole play here is to be that systems integrator provider to the EU market. The pandemic has accelerated the need for that global flat workspace, and how to place those engineers while working with our partners like Oracle and Microsoft, and to do this cheaper than India or Eastern Europe,” said Soyombo. 

Richard Lewis will lead the European expansion as CEO of Bluechip EU Subsidiary. He was the CEO of Business Logic Systems, a Bluechip partner based in the U.K. In 2017, Business Logic was acquired by Ireland-based Evolving Systems, a provider of software for connected mobile devices to over 100 network operators across 60+ countries; Lewis was the company’s senior vice president of global sales until this year. His experience will prove vital in placing Bluechip’s data engineers and IP-packaged products (including its newly launched customer data management and cash management solutions) with European partners.

“Richard has a good feel for the market. He has seen some of the initial requirements from customers that can make him say, ‘hey, if this is what you’re paying for a developer in India, we can give you an equally good developer for 20-30% less than this price.’ And that’s the target that we’re pursuing,” Soyombo said.

Richard Lewis (CEO, Bluechip Technologies EU). Image Credits: Bluechip Technologies

Bluechip’s growth over the past decade has almost mirrored the development of the African tech ecosystem and similar businesses in the same time frame despite the company not being venture-backed (its business is such that VC money isn’t necessarily required to scale). For instance, when Andela launched in 2014, it only had physical hubs in Nigeria, Kenya, Rwanda and Uganda to source, vet and train engineers to be part of remote teams for international companies. However, after going fully remote, the unicorn saw a 750% increase in applicants outside Africa as it expanded to over 80 countries. 

Ultimately, Bluechip, which operates the Andela-esque model for one of its services, plans to become a legacy multinational information technology services and consulting company like India’s Tata Consultancy Services and Tech Mahindra. In 2014, the company clocked about $5 million in revenue. Last year, it generated almost $50 million. With its pan-African and global expansion plans, Soyombo predicts that the company’s revenues might hit $250 million in five years. “We want to try it out on the EU market and see how it works. The plan is also to expand further elsewhere like French-speaking Africa and possibly North America,” said the co-founder and investor.

Bluechip, an African systems integrator with partners like Microsoft and Oracle, is expanding to Europe by Tage Kene-Okafor originally published on TechCrunch

Oracle now monitoring TikTok’s algorithms and moderation system for manipulation by China’s government


Oracle has begun auditing TikTok’s algorithms and content moderation models, according to a new report from Axios out this morning. Those reviews began last week, and follow TikTok’s June announcement it had moved its U.S. traffic to Oracle servers amid claims its U.S. user data had been accessed by TikTok colleagues in China.

The new arrangement is meant to allow Oracle the ability to monitor TikTok’s systems to help the company in its efforts to assure U.S. lawmakers that its app is not being manipulated by Chinese government authorities. Oracle will audit how TikTok’s algorithm surfaces content to “ensure outcomes are in line with expectations,” and that those models have not been manipulated, the report said. In addition, Oracle will regularly audit TikTok’s content moderation practices, including both its automated systems and the decisions made by human moderators choosing how to enforce TikTok policy.

TikTok confirmed to TechCrunch Axios’ reporting is accurate.

TikTok’s moderation policies have been controversial in years past. In 2019, The Washington Post reported TikTok’s U.S. employees had often been ordered to restrict some videos on its platform at the behest of Beijing-based teams, and that teams in China would sometimes block or penalize certain videos out of caution about Chinese government restrictions. That same year, The Guardian also reported TikTok had been telling its moderators to censor videos that mentioned things like Tiananmen Square, Tibetan independence or the banned religious group Falun Gong, per a set of leaked documents. In 2020, The Intercept reported TikTok moderators were told to censor political speech in livestreams and to suppress posts from “undesirable users” — the unattractive, poor or disabled, its documents said.

All the while, TikTok disputed the various claims — calling leaked documents outdated, for instance, in the latter two scenarios. It also continued to insist that its U.S. arm didn’t take instructions from its Chinese parent, ByteDance.

But a damning June 2022 report by BuzzFeed News proved that TikTok’s connection to China was closer than it had said. The news outlet found that U.S. data had been repeatedly accessed by staff in China, citing recordings from 80 TikTok internal meetings.

Following BuzzFeed’s reporting, TikTok announced that it was moving all U.S. traffic to Oracle’s infrastructure cloud service — a move designed to keep TikTok’s U.S. user data from prying eyes.

That agreement, part of a larger operation called “Project Texas,” had been in progress for over a year and was focused on further separating TikTok’s U.S. operations from China, and employing an outside firm to oversee its algorithms.

Now, it seems Oracle is in charge of keeping an eye on TikTok to help prevent data emanating from the U.S. from being directed to China. The deal steps up Oracle’s involvement with TikTok as not only the host for the user data, but an auditor who could later back up or dispute TikTok’s claims that its system is operating fairly and without China’s influence. 

Oracle and TikTok have an interesting history. Toward the end of the Trump administration, the former president tried to force a sale between the two companies, bringing in long-time supporter, Oracle founder and CTO Larry Ellison to help broker the deal for his company. That deal eventually fell apart in February 2021, but the story didn’t end there, as it turned out.

While this new TikTok-Oracle agreement has significance in terms of the tech industry and in politics, Oracle’s deal with TikTok doesn’t necessarily make the firm a more powerful player in the cloud infrastructure market.

Even with TikTok’s business, Oracle’s cloud infrastructure service represents just a fraction of the cloud infrastructure market. In the most recent quarter, Synergy Research, a firm that tracks this data, reported the cloud infrastructure market reached almost $55 billion, with Amazon leading the way with 34%, Microsoft in second with 21% and Google in third place with 10%. Oracle remains under 2%, says John Dinsdale, who is a principal analyst at the firm.

“Oracle’s share of the worldwide cloud infrastructure services market remains at just below 2% and has shown no signs of meaningful increase. So Oracle’s cloud revenue growth is pretty much keeping pace with overall market growth,” Dinsdale told TechCrunch. Synergy defines “cloud infrastructure services” as Infrastructure as a Service, Platform as a Service and hosted private cloud services. Dinsdale points out that Oracle’s SaaS business is much stronger.

Update, 8/16/22, 1PM ET: TikTok confirmed to TC the report from Axios is accurate.  

Oracle now monitoring TikTok’s algorithms and moderation system for manipulation by China’s government by Sarah Perez originally published on TechCrunch




