Snowflake developer & builder tools showcased at 2023 Summit
Snowflake used its Snowflake Summit 2023 event in Las Vegas this June to showcase the latest augmentations, extensions and enhancements to its builder toolset for software application development professionals.
Kicking off the ‘builder’ keynote, Benoit Dageville, president of products at Snowflake, took the audience on a tour of the Snowflake platform.
“Our platform supports all data types (structured, semi-structured or unstructured) and all users. Core use of the Snowflake platform means users can access data from multiple sources, whether internal or external, and the platform is compatible with all data formats,” said Dageville. “The Snowflake platform implements many optimisations across the spectrum (such as serverless capabilities) and the out-of-the-box performance that Snowflake offers makes our core technology proposition extremely cost-effective.”
Dageville talked about Snowpark Container Services, which in his view makes Snowpark far more powerful. Snowpipe Streaming, the Snowpark ML API and other announcements join the news related to Snowflake Native Apps, which (the company insists) all make a big difference to the way developers can bundle application logic into one package that can be listed on the Snowflake Marketplace.
Complete encapsulation
Focusing on Snowpark Container Services, this is the ability to containerise whole apps and services (typically pre-existing) and then run them ‘completely encapsulated’ inside Snowflake.
It is essentially Snowflake’s version of (or take on) virtualising workloads. This is a big advancement for the company because it allows almost anything to run on the Snowflake platform, which also enables users to take advantage of Snowflake’s core engine and governance framework.
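As an illustration of what ‘completely encapsulated’ could look like in practice, the sketch below shows a hypothetical service specification for a containerised workload of the kind Snowpark Container Services runs. The field names and layout here are assumptions for illustration, not verified against Snowflake’s documentation.

```yaml
# Hypothetical service spec for a containerised app (illustrative only;
# field names are assumed, not confirmed against Snowflake docs):
spec:
  containers:
    - name: app                                  # assumed container name
      image: /mydb/myschema/myrepo/myapp:latest  # image held in a Snowflake image repository
  endpoints:
    - name: web       # assumed endpoint exposing the app to users
      port: 8080
```

The point of the pattern is that an existing container image can be brought in largely as-is, with Snowflake supplying the compute, networking and governance around it.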
The governance aspect of the Snowflake Data Cloud means that any user can see exactly what data source, data model or data function they are plugging into securely and understand risks related to vulnerabilities, data exfiltration and so on. The company says that its technology is a protective framework, so it can be trusted and sanctioned.
Once an app is published on the marketplace, it can be accessed by other users directly on their Snowflake account. The company is also working hard to make all Snowflake apps on the marketplace available to users in all world regions. Dageville says that his team is focused on building a strong developer community and the firm has been hosting community events around the globe as well as online forums.
With Snowflake Labs, the company explains that it has open sourced many tools and also worked to help drive skills-based initiatives to enable and increase developer competencies with its platform across the full ecosystem that the firm operates in.
All types of data
Dageville’s comments related to ‘all data’ resonated with the same messages offered by Snowflake CEO Frank Slootman in the previous day’s keynote.
“The Snowflake Data Cloud is not just about structured data, although that is a big part of enterprise data. All data that has analytical value needs to be part of [the way organisations now look to work with] it. We’ve been old hands at processing semi-structured data. When we added support for unstructured data, we also needed a way to derive structure from it so it could be referenced for analytical purposes. Think about invoices, purchase orders, legal docs,” commented Slootman, in the day 1 keynote.
Back on day two, the company also suggested that Snowpark for Python has exploded in the last six months in terms of usage, interest and adoption.
Dageville and Slootman assert that if developers are not looking at Snowpark for data engineering, they might just be missing out. The company now has ‘hundreds’ of Proof of Concept (POC) demos and says that it is seeing a two to four times performance improvement alongside order-of-magnitude cost savings. The company also insists that this technology is safer, simpler and easier than legacy Spark, Hadoop and so on.
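To make the data-engineering claim concrete, here is a minimal sketch of the DataFrame style that Snowpark for Python encourages. The Snowpark calls appear as comments (the API names there are assumptions on my part and would need a live Snowflake session); the runnable part reproduces the same filter-and-aggregate step over an in-memory list of rows.

```python
# With an active Snowpark session, the step might look like this
# (assumed API, shown for orientation only):
#
# from snowflake.snowpark.functions import col, avg
# df = (session.table("ORDERS")
#              .filter(col("STATUS") == "SHIPPED")
#              .group_by("REGION")
#              .agg(avg("AMOUNT").alias("AVG_AMOUNT")))

# The same transformation expressed over plain Python rows:
from collections import defaultdict

orders = [
    {"REGION": "EMEA", "STATUS": "SHIPPED", "AMOUNT": 100.0},
    {"REGION": "EMEA", "STATUS": "SHIPPED", "AMOUNT": 300.0},
    {"REGION": "APAC", "STATUS": "OPEN",    "AMOUNT": 50.0},
]

# Keep only shipped orders, grouped by region.
amounts_by_region = defaultdict(list)
for row in orders:
    if row["STATUS"] == "SHIPPED":
        amounts_by_region[row["REGION"]].append(row["AMOUNT"])

# Average the amounts per region.
avg_amount = {region: sum(v) / len(v) for region, v in amounts_by_region.items()}
print(avg_amount)  # {'EMEA': 200.0}
```

The appeal of the Snowpark version of this is that the same declarative pipeline runs inside Snowflake, next to the data, rather than in a separate Spark or Hadoop cluster.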
Technical ‘tasty byte’ demos
In a series of extended technical demonstrations (and staged user scenario skits), Snowflake employees showcased working examples of how to use Snowflake stored procedures, among a selection of other related core technologies.
With so many teams needing to scale their workloads once initially developed, while still holding on to data structure resilience (for example, where the application involves pharmaceutical or other mission- or life-critical use cases), the team looked at several Snowpark technologies, including data preparation and ML model functionality for real-world Python operations work.
Once data models are trained in Snowpark, they can be deployed as a User Defined Function (UDF), so that Python programmers in particular can get models into production without the risk of rogue functions coming into the software supply chain.
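A rough sketch of that train-then-deploy pattern follows. The scoring function is plain, runnable Python standing in for a trained model’s predict step; the commented lines show how it might be registered as a UDF via the Snowpark API (the registration call and its parameters are assumptions, and require a live Snowflake session).

```python
# Toy scoring logic standing in for a trained model's predict() step.
def score_invoice(amount: float, days_overdue: int) -> str:
    """Flag an invoice as 'high' or 'low' risk from two simple features."""
    risk = amount * 0.001 + days_overdue * 0.1
    return "high" if risk > 5.0 else "low"

# With an active Snowpark session, the same function could be published
# as a UDF so SQL users can call it directly (assumed API, illustrative):
#
# from snowflake.snowpark.types import StringType, FloatType, IntegerType
# session.udf.register(
#     func=score_invoice,
#     name="SCORE_INVOICE",
#     return_type=StringType(),
#     input_types=[FloatType(), IntegerType()],
# )

print(score_invoice(2000.0, 40))  # a large, long-overdue invoice scores 'high'
```

Because the function is registered from inside the governed Snowflake environment rather than pulled in from an arbitrary package source, the supply-chain surface the article mentions is correspondingly smaller.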
One final customer use case was also showcased with the Depository Trust & Clearing Corporation (DTCC), a company focused on post-trade processing. DTCC advances solutions that help markets grow and protect the security of the global financial system.
The company has been using Snowflake for around five years and says that it is most excited about the Snowflake Native Application Framework. This is because financial firms typically work with batch-based data, but DTCC says it wants to offer more control and flexibility to its users, so that they can run not just standard workloads but also more experimental implementations.
Data, logic and UX all work together (and Snowflake Streamlit enables users to build data workloads without being a specialist in UX, UI or any major form of front-end development). This particular user says that it has team members who have moved from being Oracle professionals to building up large portions of Snowflake skills inside their teams.
Overall, it’s good to see a company putting the C-suite message up front, alongside the development team’s efforts (with real live coding on show) and some genuine customer use cases.
As they say… let it snow.