Data Integration: Easier Data Preparation, Write Connectors, Sentiment Analysis
Create recipes with the new visual editor, Data Prep (Beta). Write the outcome of recipes to external systems like Amazon S3 and Snowflake. Detect the sentiment of text such as product reviews and social media posts.
Prepare Data with the Next Generation of Data Prep (Beta)
The latest version of Data Prep, called Data Prep (Beta), expands on the intuitive, visual interface that lets you point and click your way to building recipes. Use the new graphical view of a recipe to see at a glance where data comes from and how it flows through the recipe to the target. To validate the recipe as you build, you can continue to preview how raw data is transformed at every step.
Get a Sense of How Your Customers and Prospects Feel
Text such as product reviews and social media posts can be a gold mine of information for your business. Use the Detect Sentiment transformation in a Data Prep (Beta) recipe to quickly bucket that text into sentiment categories: positive, negative, and neutral. For example, detect the sentiment of survey responses to evaluate how customers feel about your product support. If more than a certain percentage (say, 30%) of the comments are negative, escalate the feedback to support management.
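The bucket-then-escalate pattern above can be sketched in a few lines. This is an illustration only: Einstein's Detect Sentiment transformation uses a trained model, so the toy keyword classifier below is a stand-in to show the three buckets and the 30% escalation rule.

```python
# Toy keyword classifier standing in for the Detect Sentiment model.
NEGATIVE_WORDS = {"bad", "slow", "broken", "unhelpful"}
POSITIVE_WORDS = {"great", "fast", "helpful", "love"}

def detect_sentiment(text):
    """Bucket text as positive, negative, or neutral (illustrative logic)."""
    words = set(text.lower().split())
    if words & NEGATIVE_WORDS:
        return "negative"
    if words & POSITIVE_WORDS:
        return "positive"
    return "neutral"

def should_escalate(responses, threshold=0.30):
    """Escalate when more than `threshold` of responses are negative."""
    sentiments = [detect_sentiment(r) for r in responses]
    negative_share = sentiments.count("negative") / len(sentiments)
    return negative_share > threshold

responses = [
    "Support was great and fast",
    "The agent was unhelpful",
    "Response time was slow",
    "Everything works",
]
print(should_escalate(responses))  # 2 of 4 negative (50%) -> True
```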
Output Your Einstein Analytics Data to Amazon S3 (Beta)
Output connectors now let you push your data from Analytics into Amazon S3 when you use Data Prep (Beta). You design powerful recipes that combine data from multiple sources, add formula fields, and transform data into datasets tailored to your business needs. With output connectors and Data Prep (Beta), those datasets are no longer confined to Analytics: they are written as one or more .csv files that you can use to improve your overall business processes with better data. For example, output processed and transformed customer service data so individual agents can understand what they could be doing to improve customer satisfaction.
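To make the customer-service example concrete, here is a minimal sketch of the kind of .csv output the connector produces. The row data and field names are hypothetical; the connector handles delivery to S3 itself, so the commented boto3 call only shows what uploading the same bytes yourself would look like.

```python
import csv
import io

# Hypothetical rows from a transformed customer-service dataset.
rows = [
    {"agent": "Dana", "cases_closed": 42, "avg_csat": 4.6},
    {"agent": "Lee", "cases_closed": 37, "avg_csat": 4.1},
]

# The S3 output connector writes results like these as one or more .csv files.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["agent", "cases_closed", "avg_csat"])
writer.writeheader()
writer.writerows(rows)
csv_text = buffer.getvalue()
print(csv_text)

# Uploading the same bytes manually would look like (illustrative only):
#   boto3.client("s3").put_object(Bucket="my-bucket", Key="agents.csv",
#                                 Body=csv_text.encode())
```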
Write Your Einstein Analytics Data to Snowflake (Beta)
Output connectors let you push your data from Analytics into Snowflake when you use Data Prep (Beta). You design powerful recipes that combine data from multiple sources, add formula fields, and transform data into datasets tailored to your business needs. With output connectors and Data Prep (Beta), those datasets are no longer confined to Analytics: they are written as a table that you can use to improve your overall business processes with better data.
Sync Complete TinyInt Column Values from Select AWS RDS Connectors
The full range of TinyInt column values, from -128 to 127, is now synced from AWS RDS MySQL, MariaDB, and Aurora MySQL connections. Previously, this data type was interpreted as a bit that showed 0 or 1. Update any logic, filters, or transformations that rely on the 0 or 1 behavior to use the actual values instead.
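The migration the note asks for can be sketched as follows. The column values are hypothetical; the point is that logic written against the old collapsed 0/1 interpretation must now handle the full value range.

```python
# Hypothetical TinyInt column after sync.
synced_values = [-5, 0, 1, 3, 127]

# Old assumption: anything nonzero collapsed to 1, zero stayed 0.
old_interpretation = [1 if v != 0 else 0 for v in synced_values]

# New behavior: the actual values come through, so filter on them directly.
errors = [v for v in synced_values if v < 0]   # e.g. negative status codes
active = [v for v in synced_values if v == 1]  # exact match, not "truthy"

print(old_interpretation)  # [1, 0, 1, 1, 1] -- information was lost
print(errors, active)      # [-5] [1]
```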
Filter Data Synced from the Google BigQuery Standard SQL Connector
To speed up data sync and pull only the data that you need, you can now use data sync filters with connected objects from the Google BigQuery Standard SQL connector. Previously, you imported all data from a remote object and used a dataflow or recipe filter to limit the external data in a dataset. Now, you can exclude unnecessary or sensitive data from syncing to Analytics in the first place. For example, set up a filter to import only Canadian marketing data if you are analyzing your Canadian business unit’s progress.
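Conceptually, a sync filter applies a predicate at the source so that excluded rows never reach Analytics, instead of importing everything and filtering later in a dataflow or recipe. A minimal sketch with hypothetical marketing rows:

```python
# Hypothetical rows in the remote BigQuery object.
remote_rows = [
    {"campaign": "Spring Promo", "country": "Canada", "leads": 120},
    {"campaign": "Summer Launch", "country": "US", "leads": 300},
    {"campaign": "Fall Event", "country": "Canada", "leads": 85},
]

def sync(rows, predicate):
    """Only rows matching the filter predicate are transferred."""
    return [row for row in rows if predicate(row)]

# Filter at sync time: non-Canadian rows are never imported.
canadian_only = sync(remote_rows, lambda row: row["country"] == "Canada")
print(len(canadian_only))  # 2
```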
Sync Remote Data to Government Cloud Orgs with Connectors That Comply with FIPS 140 Requirements
To comply with US government cryptographic requirements, Analytics connectors now use FIPS 140 validated encryption. The list of connectors available to Government Cloud orgs will expand as FIPS 140 validated encryption is confirmed for each connector.
Always Get the Latest Data by Scheduling Recipes to Run After Syncs
Determining the right time to schedule a recipe is difficult when you don’t know how long the data sync takes. So stop guessing what time to run the recipe and schedule it to start automatically after the data sync completes.
Keep Your Data More Up-To-Date with Shorter Schedule Intervals
You can now schedule data sync, dataflow, and recipe jobs to run every 15, 20, 30, or 60 minutes with sub-hourly scheduling if your org uses an Einstein Analytics Plus license. Previously, you had to contact Salesforce Support to enable sub-hourly scheduling. If you don’t have an Einstein Analytics Plus license, you must still contact Salesforce Support to enable it. This change doesn’t apply to sandboxes.
Keep Tabs on Recipe Jobs with Notifications
Set recipe notifications to receive an email notification when a recipe finishes. You can be notified only when there are warnings, only when the recipe fails, or every time the recipe finishes. You can also set an elapsed time notification to notify you when a recipe takes longer than a specified length of time.
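The notification choices above amount to a small decision rule. This is a conceptual sketch only; the function name, mode strings, and the assumption that the warnings mode also covers failures are illustrative, not the actual Analytics settings or API.

```python
def should_notify(status, mode, elapsed_minutes=0, elapsed_limit=None):
    """Decide whether to send an email for a finished recipe run.

    status: 'success' | 'warning' | 'failure' (illustrative values)
    mode:   'always' | 'warnings' | 'failures'
    """
    # Elapsed-time notification fires regardless of the run's outcome.
    if elapsed_limit is not None and elapsed_minutes > elapsed_limit:
        return True
    if mode == "always":
        return True
    if mode == "warnings" and status in ("warning", "failure"):
        return True  # assumption: warnings mode also covers failures
    if mode == "failures" and status == "failure":
        return True
    return False

print(should_notify("success", "failures"))                        # False
print(should_notify("warning", "warnings"))                        # True
print(should_notify("success", "failures", 90, elapsed_limit=60))  # True
```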
Restore Previous Dataflow Versions Easily with History Feature
Using Dataflow History, edit a dataflow and save new versions along with brief descriptions of what’s changed. When you save a new version, it becomes the live version that keeps Analytics data up to date. All previous versions can be viewed through the Analytics Data Manager, and you can quickly revert to or delete any of them.
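The save-new-version, view-history, revert flow described above can be sketched as a minimal version store. The class and method names are illustrative, not the actual Analytics implementation; the sketch shows why reverting is cheap when every saved version is kept.

```python
class DataflowHistory:
    """Minimal sketch of dataflow version history with revert."""

    def __init__(self):
        self.versions = []     # list of (definition, description) tuples
        self.live_index = None

    def save(self, definition, description):
        # Saving a new version makes it the live version.
        self.versions.append((definition, description))
        self.live_index = len(self.versions) - 1

    def revert_to(self, index):
        # Any previously saved version can become live again.
        self.live_index = index

    @property
    def live(self):
        return self.versions[self.live_index][0]

history = DataflowHistory()
history.save({"nodes": ["extract"]}, "initial version")
history.save({"nodes": ["extract", "augment"]}, "added augment node")
history.revert_to(0)
print(history.live)  # {'nodes': ['extract']}
```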