Universal Containers (UC) has over 10 million accounts with an average of 20 opportunities per account. A Sales Executive at UC needs to generate a daily report for all opportunities in a specific opportunity stage.
Which two key considerations should be made to ensure the report's performance is not degraded by the large data volume?
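For context on the scenario above, report performance at this scale usually hinges on selective filters against indexed fields and a narrow date range. The following is a minimal SOQL sketch of such a filter, not the exam's expected answer; the stage value and date window are illustrative, and StageName is assumed to carry a custom index.

// Hypothetical selective query behind the daily report; StageName is assumed
// to be custom-indexed and the CloseDate window keeps the scanned rows small.
List<Opportunity> opps = [
    SELECT Id, Name, Amount, CloseDate
    FROM Opportunity
    WHERE StageName = 'Negotiation'
      AND CloseDate = LAST_N_DAYS:30
];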
Northern Trail Outfitters has these simple requirements for a data export process:
The file format should be CSV.
The process should be scheduled to run once per week.
The export should be configurable through the Salesforce UI.
Which tool should a data architect leverage to accomplish these requirements?
Universal Containers (UC) has a Salesforce org with multiple automated processes defined for group membership processing. UC also has multiple admins on staff who perform manual adjustments to the role hierarchy. The automated and manual tasks overlap daily, and UC is consistently experiencing "lock errors".
What should a data architect recommend to mitigate these errors?
Universal Containers (UC) has a very large and complex Salesforce org with hundreds of validation rules and triggers. The triggers are responsible for system updates and data manipulation as records are created or updated by users. The majority of the automation tools in UC's org were not designed to run during a data load. UC is importing 100,000 records into Salesforce across several objects over the weekend.
What should a data architect do to mitigate any unwanted results during the import?
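As background for the scenario above, one common pattern for keeping automation out of a bulk load (not necessarily the exam's expected answer) is a bypass flag held in a hierarchy custom setting that each trigger checks before running. A minimal Apex sketch follows; the setting name Data_Load_Settings__c and field Bypass_Automation__c are assumptions for illustration.

// Hypothetical hierarchy custom setting with a checkbox field; enabling it for
// the data-load user's profile skips trigger logic during the weekend import.
trigger AccountTrigger on Account (before insert, before update) {
    Data_Load_Settings__c settings = Data_Load_Settings__c.getInstance();
    if (settings != null && settings.Bypass_Automation__c) {
        return; // skip automation while the bulk import runs
    }
    // ... normal system updates and data manipulation ...
}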