Agentforce: The New AI Wave

Last month, I attended Dreamforce 2024, the world’s largest software conference, in San Francisco. This massive annual event is always a great learning experience. The key announcement at Dreamforce 2024 was a new AI era with Agentforce.

Agentforce is essentially Salesforce’s name for AI agents. Having covered AI agents in my previous blog, I will explain Agentforce here in the context of Salesforce/MuleSoft.

The study found that 90% of businesses say their industry has become more competitive in the last three years, and 48% say it has become much more competitive. This has squeezed margins, forced companies to chase more productivity, and pushed businesses in every industry to transform in order to remain relevant.

So the question is: how do we close this gap and stay relevant to the market, whatever the industry?

We started the AI journey with predictive analytics as the first wave of AI. Next came the generative AI wave. Now we are at the next inflection point: Agentforce, or AI agents. AI agents are what will ultimately close this gap, and the way we will do it is by getting time back, raising productivity, and growing the business with AI agents.

So here are a few questions I will try to answer.

What is Agentforce?

Agentforce is Salesforce’s newest tool, allowing customers to build and customize autonomous agents to scale their workforce. It is a UX through which customers can leverage their data sources to deliver more human-like interactions.

How does Agentforce help customers achieve business goals?

Agentforce gives companies a 24/7 agent that engages on their behalf to resolve sales, service, and marketing-related topics, including customer service cases and prospect engagement.

With Agentforce, companies can drive productivity to deliver higher profitability, while building stronger customer relationships.

How does MuleSoft enhance Agentforce?

Salesforce primarily focuses on front-end “human assistant” agents with the Agentforce UX, while MuleSoft primarily focuses on back-end domain-expert agents that manage domain complexity (inventory, payroll, etc.) and power other prompts or agents.

MuleSoft expands the actionability of Agentforce agents by providing API actions and other domain assets that bring broader context to the agent’s role, knowledge, actions, guardrails, and channel.

How are customers accessing data for Agentforce?

The Agentforce messaging encourages customers to use Data Cloud to bring in their data and ground Agentforce. To bring MuleSoft into this conversation, leverage the MuleSoft + Data Cloud value proposition, where MuleSoft accelerates value across four use cases (on-premises, transactional, unstructured, and activation data):

On-premises data: MuleSoft can run locally and stream data to Data Cloud, giving Agentforce additional context for improved grounding and better decision making.

Transactional data: Transactional systems need queuing, error handling, and delivery controls for ingestion; MuleSoft can easily deliver this functionality so that Agentforce agents aren’t slowed down.

Unstructured data: MuleSoft offers pre-built accelerators for ingesting unstructured data from Google Drive, Confluence, and SharePoint, as well as OCR for images. Agentforce agents can have immediate access to data from scanned images such as government identification.

Activation: Use MuleSoft to respond to data events in Data Cloud and drive action in real time to any downstream system for full circle updates.

What are the agent use cases that MuleSoft supports?

● Service Agents: Agentforce needs contextual data from external systems in order to deflect cases faster

● Sales Agents: MuleSoft can upload and share leads with partners securely, without compromising data integrity and within your governance rules. Near real-time synchronization with external systems ensures that Agentforce can engage with prospects from the moment leads come in.

● Commerce Agents: Setting up and managing storefronts requires data from external systems including product information, inventory levels, and pending vendor shipments. MuleSoft connects to external systems for near real-time updates so Agentforce can respond with accurate information.

● Employee Service Agents (Workday): Automating onboarding and provisioning for new hires requires data from external systems, and in some cases that data is unstructured, found in PDF, JPG, and PNG files such as scanned government IDs and manually filled-out forms. MuleSoft’s Intelligent Document Processing makes it easier to upload unstructured data so that you can share it faster with Agentforce.

How is Agentforce different from the MuleSoft AI Chain (MAC) Project?

The MAC Project mainly targets technical users, i.e., MuleSoft users and developers. With the MAC Project, customers can create powerful agents fully composed in the MuleSoft Anypoint Platform and benefit from its end-to-end lifecycle governance and management capabilities. With API Management, you can layer LLM-specific policies on top to further address security when interacting with LLMs. The MAC Project is an open-source project that is currently being productized. Agentforce, by contrast, is aimed at non-technical users who want to build powerful agents directly in Salesforce. It is fully integrated into every Salesforce Cloud and provides out-of-the-box integration with the Salesforce ecosystem.

AI-Powered Experiences
Connect | Automate | Scale

Over the last few years, generative AI has played a significant role across organizations. It is also very interesting that just 2% of organizations expect few to no barriers to bringing generative AI into their organization.

In IT, change is the only constant. We migrated to the cloud, we’re managing an explosion of customer data, and we’re starting to automate our processes. Yet many of us approach this AI inflection point more nervously than previous big waves of innovation. To manage it, it is very important to streamline our AI journey.

Our first priority is to unlock our data and make it discoverable, from anywhere. This includes on-premises and hybrid/cloud data, as well as data in any format, structured and unstructured. Integration and APIs help you build a framework to unlock data across all of your disparate systems.

Since data is everywhere and sources are spread across your organization, pulling it together is a human-centric task. To reduce this manual effort, we need to create workflows and automate tasks across structured and unstructured data with minimal coding. This can be achieved by leveraging APIs, Data Cloud, and automation tools like RPA and IDP.

Next, we talked about the importance of building securely. With a backlog of ongoing projects, we need a way to scale the use of these API building blocks across the business, with security and governance. We need a way to protect and implement security policies across every API in your digital space before you launch your next application, like an e-commerce platform or even a mobile app. Universal API Management allows us to bring security and governance to any API.

And finally, we need just one more piece: an AI model, which applications interact with via an API. As we make our data inventory discoverable, composable, and automated, we can build those experiences using AI models. When we bring these technologies together with an LLM, we can create intelligent, AI-driven experiences, implementing predictive and generative capabilities over discoverable, consumable data exposed through APIs.

API Security

Modern APIs are the building blocks of integration and applications for any organization. Every day, organizations use APIs to unlock new features and enable innovation. From banks, retail, and transportation to IoT, autonomous vehicles, and smart cities, APIs are a critical part of modern mobile, SaaS, and web applications and can be found in customer-facing, partner-facing, and internal applications.

Organizations expose sensitive data, such as Personally Identifiable Information (PII), through APIs, and because of this APIs have increasingly become a target for attackers. As a result, organizations are concerned about API security and compliance. API security focuses on strategies and solutions to understand and mitigate the unique vulnerabilities and security risks of Application Programming Interfaces (APIs). According to the Open Web Application Security Project (OWASP) API Security Top 10 (2023), these API threats fall into ten categories:

  1. Broken Object Level Authorization (BOLA) – Object-level authorization is an access control mechanism that is usually implemented at the code level to validate that a user can only access the objects that they should have permission to access.
    Comparing the user ID of the current session (e.g. by extracting it from the JWT token) with the vulnerable ID parameter isn’t a sufficient solution to solve Broken Object Level Authorization (BOLA).

    For example, an API that lists a school’s revenue by school name, such as the endpoint /county/{schoolName}/revenues, could expose this threat:
    an attacker simply manipulates {schoolName} in the endpoint to pull revenue details for every school.

    To mitigate this risk, use the authorization mechanism to check that the logged-in user has access to perform the requested action on the record, in every function that uses client input to access a record in the database (a minimal Java sketch of this check appears after this list).
  2. Broken Authentication – Authentication endpoints are exposed to everyone by design, which makes them an easy target for attackers. When authentication is broken, attackers can gain complete control of other users’ accounts in the system, read their personal data, and perform sensitive actions on their behalf.

    The API authentication flows and processes need to be well protected, and “forgot password / reset password” endpoints should be treated the same way as authentication mechanisms. Make sure you know all possible authentication flows into the API (mobile, web, or any other entry point) and that each one is properly protected.
  3. Broken Object Property Level Authorization – When authorizing a user to access an object through an API endpoint, it is very important to validate that the user has permission to access the specific object properties being exposed.
    An API endpoint is considered vulnerable if:
    • The API endpoint exposes properties of an object that are considered sensitive and should not be read by the user.
    • The API endpoint allows a user to change, add, or delete the value of a sensitive object property that the user should not be able to access.

      When you are exposing any API endpoint, always make sure that the user has access to the object’s properties you expose and avoid using any generic methods like to_json() and to_string().
  4. Unrestricted Resource Consumption – Serving any API request requires resources such as network bandwidth, CPU, memory, and storage. These resources are limited and carry a cost.

    It is easy to exhaust these resources with simple API calls or multiple concurrent requests. An API is vulnerable if at least one of the following limits is missing or set inappropriately:
    • Execution timeouts
    • Maximum allowable memory
    • Maximum number of file descriptors
    • Maximum number of processes
    • Maximum upload file size
    • Number of operations to perform in a single API client request (e.g. GraphQL batching)
    • Number of records per page to return in a single request-response
    • Third-party service providers’ spending limit
  5. Broken Function Level Authorization – If administrative API flows such as delete, update, or create are exposed to unauthorized users, the endpoint is easily exploitable. The best way to find broken function level authorization issues is to perform a deep analysis of the authorization mechanism, keeping in mind the user hierarchy and the different roles or groups in the application, and asking the following questions:
    • Can a regular user access the administrative endpoint?
    • Can a user perform sensitive actions (e.g. creation, modification, or deletion) that they should not have access to by simply changing the HTTP method (e.g. from GET to DELETE)?
    • Can a user from Group X access a function that should be exposed only to users from Group Y, by simply guessing the endpoint URL and parameters?

      To mitigate this risk, the enforcement mechanism(s) must deny all access by default, requiring explicit grants to specific roles for access to every function.
  6. Unrestricted Access to Sensitive Business Flows — Some API endpoints and business flows are more sensitive and critical than others. It is very important to understand which endpoints and business flows you are exposing to customers, because any restricted business flow exposed to clients can harm your business. In general, the technical impact is not severe, but the business impact can hurt your company’s credibility.

    For example, if your company offers one customer a 20% discount and another customer a 30% discount through an API, and the first customer discovers this variation, it will damage the company’s credibility and may lead to revenue loss.
    The mitigation planning should be done in two layers:
    • Business – identify the business flows that might harm the business if they are excessively used.
    • Engineering – choose the right protection mechanisms to mitigate the business risk.
  7. Server-Side Request Forgery – A Server-Side Request Forgery (SSRF) vulnerability occurs when you consume remote APIs and resources without validating the remote endpoint or user-supplied URL. SSRF enables attackers to force the application to send crafted requests to an unintended destination, even when it is protected by a firewall. Successful exploitation might lead to internal service enumeration (e.g. port scanning), information disclosure, bypassing firewalls, or defeating other security mechanisms.

    The SSRF risk cannot be fully eliminated, but you can mitigate it by isolating the resource-fetching mechanism in your network, allowlisting the media types accepted for a given functionality, disabling HTTP redirections, validating and sanitizing all client-supplied input data, and using a well-tested, maintained URL parser to avoid issues caused by URL parsing inconsistencies.
  8. Security Misconfiguration — A security misconfiguration vulnerability occurs when the latest patches are missing on the server, systems are outdated, Transport Layer Security (TLS) is missing, a Cross-Origin Resource Sharing (CORS) policy is missing, or error messages include stack traces or expose other sensitive information. Attackers often attempt to find unpatched flaws, common endpoints, services running with insecure default configurations, or unprotected files and directories to gain unauthorized access or knowledge of the system. These misconfigurations not only expose sensitive user data but also system details that can lead to full server compromise.

    Security misconfiguration risk can be mitigated by a repeatable hardening process that allows fast and easy deployment, by ensuring all communication happens over an encrypted channel (TLS), and by implementing a proper Cross-Origin Resource Sharing (CORS) policy.
  9. Improper Inventory Management — It is important for organizations not only to have good understanding and visibility of their own APIs and API endpoints, but also of how those APIs store or share data with external third parties. Multiple versions of APIs need to be properly managed, secured, patched, and well-documented. Hackers usually gain unauthorized access through old API versions or endpoints left running unpatched and with weaker security requirements.
    This vulnerability can be mitigated by documenting all hosted APIs across all environments (production and non-production), generating documentation automatically by adopting open standards, and avoiding the use of production data in non-production API deployments.
  10. Unsafe Consumption of APIs — This vulnerability occurs when developers adopt weaker security standards for data received from third-party APIs, for instance around input validation, sanitization, and URL redirections, or when they do not implement timeouts for interactions with third-party services.
    It can be mitigated by implementing proper data and schema validation, ensuring all API interactions happen over secured communication channels such as TLS, and maintaining an allowlist of well-known locations that integrated APIs may redirect to, rather than blindly following redirects.
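
To make the object-level authorization check from item 1 concrete, here is a minimal sketch in plain Java. It is purely illustrative and framework-agnostic: RevenueService, its in-memory maps, and the user IDs are invented for this example, and the authenticated user ID is assumed to come from a verified session or JWT, never from the request itself.

// Hypothetical object-level authorization check (BOLA mitigation sketch).
// Every handler that loads a record from a client-supplied identifier should
// run a check like this before returning or modifying the record.
import java.util.Map;
import java.util.Objects;

public class RevenueService {

    // Illustrative in-memory "database": school name -> owning user ID and revenue.
    private final Map<String, String> schoolOwners = Map.of(
            "springfield-high", "user-42",
            "shelbyville-high", "user-77");
    private final Map<String, Double> schoolRevenues = Map.of(
            "springfield-high", 1_250_000.0,
            "shelbyville-high", 980_000.0);

    // Returns revenue only if the authenticated user owns the school record.
    public double getRevenue(String authenticatedUserId, String schoolName) {
        String owner = schoolOwners.get(schoolName);
        if (owner == null) {
            throw new IllegalArgumentException("Unknown school: " + schoolName);
        }
        // Object-level authorization: reject access to records the caller does not own.
        if (!Objects.equals(owner, authenticatedUserId)) {
            throw new SecurityException("Access denied for user " + authenticatedUserId);
        }
        return schoolRevenues.get(schoolName);
    }

    public static void main(String[] args) {
        RevenueService service = new RevenueService();
        System.out.println(service.getRevenue("user-42", "springfield-high")); // allowed
        try {
            service.getRevenue("user-42", "shelbyville-high");                 // denied
        } catch (SecurityException e) {
            System.out.println(e.getMessage());
        }
    }
}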

Recession: Impact on Software as a Service (SaaS)

Global uncertainties continue to dominate headlines. Inflation is expected to reach highs of ~3.5% in the US and Europe by the end of 2023. To ease inflation, central banks need to dampen demand by making it expensive for financial institutions, businesses, and households to borrow, which means raising interest rates. The Federal Reserve is expected to raise rates to 4.75%–5.0% by the end of 2023. All of this data suggests we are heading toward a recession. The US labor market was robust last quarter, but this quarter is not very promising, and every day we hear layoff news from different sectors.

IMF inflation forecast

This inflation and layoff news is impacting the tech market. Many companies have a growth challenge: they expect to get as much as 50 percent of their revenue from new businesses and products by 2026 but are not on a path that will take them there. Current economic conditions are forcing high-growth yet unprofitable tech startups to tighten their financial belts.

There are a few realities software companies are facing in their pursuit of growth.

Venture capital funding for US-based software startups has slowed down – VCs are wary of high valuations and are demanding that companies spend less, improve profit margins, and deliver more output. Unicorn creation also slowed in Q4 2022, one of the lowest quarterly counts since the first quarter of 2020.

Depressed company valuations – Private company valuations are cooling down. Over the last 4 quarters, we have seen public valuations compressing.

 Software companies have three critical revenue streams.

  1. License / subscription revenue – The customer pays for the right to own and use a copy of the software/hardware product, or to subscribe to and access a software platform.
  2. Support / maintenance revenue – The customer pays for ongoing support or premium support of the software or hardware product.
  3. Professional services revenue – The customer pays the software provider for specific deliverables such as software implementation or technical training.

In the current climate, all three of these revenue streams are shrinking. Companies are using only the essential services needed to run their business. This directly impacts software revenue, which pushes these companies toward lower valuations.

Infrastructure maintenance – SaaS companies provide software as a service, which means the customer does not have to purchase hardware to run the software; that cost is transferred to the SaaS provider. This implies a continuous cost of running the software, and that cost is not going anywhere. With inflation, this SaaS running cost increases tremendously.

API Security

APIs are a key component of digital transformation. An API is the interface to your legacy and SaaS data. The goal of APIs is to facilitate the transfer and enablement of data between your systems and external users. APIs are typically available over public networks like the internet so they can communicate with external consumers, which exposes your data to the public domain.

Since your data is exposed to the public domain through APIs, this can lead to data breaches. APIs can be broken and expose sensitive personal as well as company data. An insecure API is an easy target for hackers trying to gain access to your systems and network. With the rise of IoT devices and their heavy use of APIs, APIs are now even more vulnerable.

According to OWASP, these are the 10 main API vulnerabilities:

  1. Broken Object Level Authorization – Exposed endpoints that handle object identifiers create a wide attack surface for object-level access control issues.
  2. Broken User Authentication – Authentication mechanisms are implemented incorrectly.
  3. Excessive Data Exposure – Developers expose all object properties without considering their individual sensitivity.
  4. Lack of Resources & Rate Limiting – APIs do not impose any restrictions on the size or number of resources that can be requested by the client/user, which can lead to Denial of Service (DoS) attacks on APIs.
  5. Broken Function Level Authorization – Complex access control policies with different hierarchies lead to authorization flaws.
  6. Mass Assignment – Binding client input to data models without proper property filtering based on an allowlist usually leads to mass assignment.
  7. Security Misconfiguration – Missing or insecure security configuration commonly results in insecure APIs.
  8. SQL Injection – SQL injection occurs when untrusted data is sent to an interpreter as part of a command or query.
  9. Improper Assets Management – APIs tend to expose more endpoints than traditional web applications, which leads to improperly exposed or undocumented endpoints.
  10. Insufficient Logging & Monitoring – Insufficient logging and monitoring make it hard to detect vulnerabilities and broken integrations.

How to mitigate API security risk?

  • Use Secure Sockets Layer (SSL)/Transport Layer Security (TLS) and HTTPS, which provide security by encrypting data during transfer.
  • Apply Basic Auth at a minimum, or, for stronger security, use token-based authentication through the OAuth 2.0 framework.
  • Apply authorization on each API resource for finer-grained control over API security, ideally through an external identity and access management (IAM) provider.
  • Use encryption and signatures for all sensitive personal and organizational data exposed by your APIs.
  • Apply API throttling through your API manager to control the number of requests per API (rate limiting); a minimal sketch of the idea appears after this list.
  • Implement exception-handling best practices in your APIs to hide internal server and database details and mitigate SQL injection risk.
  • Use a service mesh to manage the different layers of API management and control.
  • Audit your APIs and remove all unused APIs from your API catalog.
  • Add proper logging, monitoring, and alerting to your APIs to keep track of API activity.
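
To illustrate the rate-limiting point, here is a minimal token-bucket sketch in plain Java. It is only a conceptual illustration under simplified assumptions; in practice you would apply your API manager’s rate-limiting or SLA policies rather than hand-rolling throttling, and the client IDs and limits below are made up.

// Minimal token-bucket rate limiter sketch (illustrative only).
// Each client gets a bucket of "capacity" tokens that refills at "refillPerSecond".
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class RateLimiter {

    private static class Bucket {
        double tokens;
        long lastRefillNanos;
    }

    private final int capacity;
    private final double refillPerSecond;
    private final Map<String, Bucket> buckets = new ConcurrentHashMap<>();

    public RateLimiter(int capacity, double refillPerSecond) {
        this.capacity = capacity;
        this.refillPerSecond = refillPerSecond;
    }

    // Returns true if the client may make this call, false if it should be throttled.
    public synchronized boolean tryAcquire(String clientId) {
        Bucket b = buckets.computeIfAbsent(clientId, id -> {
            Bucket created = new Bucket();
            created.tokens = capacity;
            created.lastRefillNanos = System.nanoTime();
            return created;
        });
        long now = System.nanoTime();
        double elapsedSeconds = (now - b.lastRefillNanos) / 1_000_000_000.0;
        b.tokens = Math.min(capacity, b.tokens + elapsedSeconds * refillPerSecond);
        b.lastRefillNanos = now;
        if (b.tokens >= 1.0) {
            b.tokens -= 1.0;
            return true;
        }
        return false; // the caller would typically respond with HTTP 429 Too Many Requests
    }

    public static void main(String[] args) {
        RateLimiter limiter = new RateLimiter(5, 1.0); // burst of 5, refill 1 request/second
        for (int i = 1; i <= 7; i++) {
            System.out.println("request " + i + " allowed = " + limiter.tryAcquire("client-a"));
        }
    }
}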

Conclusion: APIs are a critical part of modern AI, mobile, SaaS, IoT, and web applications. API security should therefore be a central focus, with strategies and solutions in place to mitigate these unique vulnerabilities and security risks.

MuleSoft Performance Tuning

API Performance Tuning
  1. Keep the application synchronous if possible. Synchronous flows avoid serialization/deserialization of messages sent through VM queues, do not cause context switches, and do not cause contention when messages move across thread pools.
  2. Store as little as possible in variables. The vars are serialized and deserialized every time a message crosses an endpoint, even if it is a VM endpoint. This adds performance overhead in direct proportion to the size of the variables and the number of endpoints.
  3. Use Java payloads with DataWeave whenever possible. The usage of a canonical data model is recommended for projects that deal with data (mapping, transformation, etc.). It is also recommended to represent it as Java objects where possible, as this provides the fastest format for accessing fields, changing information, and converting to other formats.
  4. Favor DataWeave. For better performance, use DataWeave for simple data extraction from messages, and Java components together with DataWeave for everything else.
  5. Use flow references instead of VM endpoints. To communicate between flows internally within an application, use flow references instead of VM endpoints. The VM connector, even though it is an in-memory protocol, emulates transport semantics that serialize and deserialize parts of your messages, most notably the vars. This makes it slower than a flow reference, which just injects messages into the referenced flow with no intermediate steps. Note that in some cases the usage of VM endpoints is preferred (see the chapter on reliability patterns); for example, a Mule cluster can load balance applications that use VM endpoints by deferring execution to another, available node in the cluster.
  6. Cache aggressively. Take advantage of Mule’s caching scope when making requests to external resources like web services or databases. Also consider caching reusable assets such as security tokens or ephemeral API keys and cookies (a plain-Java illustration of this idea appears after this list). Mule’s notification subsystem can additionally be used to “warm up” a cache when Mule starts, for example in situations where an initial cache miss is not acceptable.
  7. Configure message processors and endpoints at the global level. Some connectors allow you to configure parameters at both the global and the endpoint/message processor level. We recommend placing the configuration at the global level to avoid repeated initialization of resources.
  8. Avoid creating a large volume of business events. Business events incur performance overhead in Mule and in the platform when the platform’s internal event buffer overflows. Thus, avoid using either default flow-level business events or a large volume of custom business events in a high-message-volume project.
  9. Consider using message compression. For communication between Mule applications over the network, consider using Mule’s compression processors to compress/decompress message payloads before they hit the wire if their sizes are large.
  10. Consider using VM queues instead of an external message broker. VM queues are fast and have some guaranteed delivery semantics in a cluster. Consider using these instead of going out to an external messaging broker for inter-application Mule communication.
  11. Use the async scope when appropriate. If a flow is performing processing on a message that is neither modifying the message nor changing how it is routed, then it can be wrapped in an async block. This causes the processing to occur in a different thread and avoids adding unnecessary overhead to processing the message.
  12. Use connection pooling for connectors, because the performance cost of establishing a connection to another data source, such as a database, is relatively high.
  13. Optimize the logging within your Mule flows. Too much logging will slow down your flows, and too little logging will make them hard to debug.
  14. Encrypt selectively. Encryption and decryption of data are very costly, so apply encryption/decryption only when your Mule application really needs it.
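
To illustrate the caching advice in item 6 outside of a Mule flow, here is a minimal plain-Java sketch of caching a reusable security token with a time-to-live. It is purely illustrative: the fetchToken supplier stands in for whatever expensive call (for example, an OAuth token request) actually obtains the token, and inside a Mule application you would normally reach for the Cache scope or an Object Store instead of hand-rolled caching.

// Minimal expiring-cache sketch for reusable assets such as security tokens.
import java.time.Duration;
import java.time.Instant;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

public class TokenCache {

    // Holds the cached value together with its expiry time.
    private record CachedToken(String value, Instant expiresAt) {}

    private final Supplier<String> fetchToken; // expensive call, e.g. an OAuth token request
    private final Duration ttl;
    private final AtomicReference<CachedToken> cached = new AtomicReference<>();

    public TokenCache(Supplier<String> fetchToken, Duration ttl) {
        this.fetchToken = fetchToken;
        this.ttl = ttl;
    }

    // Returns the cached token, refreshing it only when it has expired.
    public String get() {
        CachedToken current = cached.get();
        if (current == null || Instant.now().isAfter(current.expiresAt())) {
            current = new CachedToken(fetchToken.get(), Instant.now().plus(ttl));
            cached.set(current);
        }
        return current.value();
    }

    public static void main(String[] args) {
        TokenCache cache = new TokenCache(
                () -> "token-" + System.nanoTime(), // stand-in for a real token request
                Duration.ofMinutes(10));
        System.out.println(cache.get()); // fetched
        System.out.println(cache.get()); // served from the cache
    }
}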

API Integration with IoT and CRM Improves Customer Service

API integration helps IoT and CRM deliver a better customer experience.

IoT (the Internet of Things) is revolutionizing our lives. As per a Gartner report, the IoT market will grow into a 58-billion-dollar opportunity by 2025. It is affecting all parts of our lives; during the pandemic we found even more uses for IoT devices to help maintain social distancing.

IoT is also one of the main disruptive technologies in business, affecting every domain including healthcare, retail, automotive, and security.

IoT offers a wide range of benefits in business:

  • Enhanced productivity
  • Better customer experience
  • Cost-effectiveness

A CRM system keeps everything about your customer relationships, such as data, notes, metrics, and more, in one place. CRM helps small businesses take the burden off the IT management team by automating business processes. It also helps employees keep their focus on critical business areas.

APIs help integrate these two otherwise unrelated systems, enabling them to optimize and streamline the whole business process. The API is the main communication channel for building robust processes and keeping these systems updated in real time. APIs also make it possible to build context-aware applications that combine IoT and CRM to interact with the physical world.

Here are a few areas where IoT, with the help of APIs, helps a CRM system optimize business processes.

  1. Optimize customer service – Before your customer finds an error in your service or product, you proactively act on and fix it. This helps build the relationship with the customer.
  2. Increase sales – With the help of IoT and CRM data, you can find untapped opportunities and use them to increase sales.
  3. Personalize the customer experience – By analyzing the data provided by IoT and CRM systems, you can build user-based predictive models that enable a personalized experience.
  4. Customer retention – CRM provides customer data and relationships, while IoT data captures customer behavior. Together they help any business personalize and target marketing for its customers.
  5. Omnichannel in-store experience – IoT and CRM help businesses deliver a 360-degree omnichannel customer experience, for example by suggesting products the customer might purchase.

API integration with IoT and CRM helps businesses achieve a higher degree of personalization, targeted marketing, optimized pricing models, higher revenue, and greater customer satisfaction.

Anypoint Platform: External (OKTA) Identity Management

Anypoint Platform acts as a client provider by default, but you can also configure external client providers to authorize client applications. As an API owner, you can apply an OAuth 2.0 policy to authorize client applications that try to access your API. You need an OAuth 2.0 provider to use an OAuth 2.0 policy. You can configure more than one client provider and associate the client providers with different environments. If you configure multiple client providers after you have already created environments, you can associate the new client providers with the environment. 

MuleSoft supports client management by identity providers that implement the OpenID Connect Dynamic Client Registration open standard. MuleSoft explicitly verifies support in Anypoint Platform for Salesforce, Okta, and OpenAM v14 Dynamic Client Registration. The following table contains examples of the URLs you need to supply, depending on your provider, during registration.

URL Name             | Okta Example URL                   | OpenAM Example URL                | Salesforce Example URL
Base                 | https://example.okta.com/oauth2/v1 | https://example.com/openam/oauth2 | https://example.salesforce.com/services/oauth2
Client Registration  | {BASE URL}/clients                 | {BASE URL}/connect/register       | {BASE URL}/register
Authorize            | {BASE URL}/authorize               | {BASE URL}/authorize              | {BASE URL}/authorize
Token                | {BASE URL}/token                   | {BASE URL}/access_token           | {BASE URL}/token
Token Introspection  | {BASE URL}/introspect              | {BASE URL}/introspect             | {BASE URL}/introspect

Steps to Create External Client Provider

  • Log in to Anypoint Platform using an account that has the organization administrator role.
  • In Anypoint Platform, click Access Management.
  • In the menu on the left, click Client Providers.

  • Click Add Client Provider, and then select OpenID Connect Dynamic Client Registration.
    The Add OIDC client provider page appears.
  • After obtaining values from your identity provider’s configuration, complete the following required fields in each section:
    • Dynamic Client Registration
      • Issuer: URL that the OpenID provider asserts is its trusted issuer.
      • Client Registration URL: The URL used to dynamically register client applications with your identity provider.
      • Authorization Header
        • For Okta, this value is SSWS ${api_token}, where api_token is an API token created through Okta.
        • For ForgeRock, this value is Bearer ${api_token}, where api_token is an API token created through ForgeRock.
        • For Salesforce, this value is Bearer ${api_token}, where api_token is an API token created through Salesforce.
      In Advanced Settings, you can also select:
      • Disable server certificate validation: Disables server certificate validation if your OpenID client management instance presents a self-signed certificate, or one signed by an internal certificate authority.
      • Enable client deletion in Anypoint Platform: Enables deletion of clients created with this integration.
      • Enable client deletion and updates in IdP: To use this option, you must also select the Enable client deletion in Anypoint Platform option.
    • Token Introspection Client
      • Client ID: The client ID for an existing client in your IdP capable of introspection of all tokens from all clients.
        • For Okta, this value should be a “Confidential” client.
        • For ForgeRock, this value should be a “Confidential” client.
        • For Salesforce, this value should be a “Confidential” client.
      • Client Secret: The client secret that corresponds to the client ID.
    • OpenID Connect Authorization URLs
      • Authorize URL: The URL where the user authenticates and grants OpenID Connect client applications access to the user’s identity.
      • Token URL: The URL that provides the user’s identity, encoded in a secure JSON Web Token.
      • Token Introspection URL: The endpoint that returns metadata about an access token, including its expiration and active state (a hypothetical example of calling this endpoint appears after this list).
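
To make the token introspection client more concrete, here is a hypothetical sketch of calling an OAuth 2.0 token introspection endpoint (RFC 7662) with Java’s built-in HTTP client. The URL, client ID, client secret, and token are placeholders; once the provider is configured, Anypoint Platform performs this kind of call for you, so the sketch is only meant to show what such a request looks like.

// Hypothetical RFC 7662 token introspection call (placeholder values only).
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class IntrospectionClient {

    public static void main(String[] args) throws Exception {
        String introspectionUrl = "https://example.okta.com/oauth2/v1/introspect"; // placeholder
        String clientId = "my-introspection-client";                               // placeholder
        String clientSecret = "my-client-secret";                                  // placeholder
        String accessToken = "token-under-test";                                   // token to check

        // The introspection client authenticates with HTTP Basic auth (client ID + secret).
        String basicAuth = Base64.getEncoder()
                .encodeToString((clientId + ":" + clientSecret).getBytes(StandardCharsets.UTF_8));

        String form = "token=" + URLEncoder.encode(accessToken, StandardCharsets.UTF_8)
                + "&token_type_hint=access_token";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(introspectionUrl))
                .header("Authorization", "Basic " + basicAuth)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(form))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The JSON response contains at least an "active" flag, plus metadata such as "exp".
        System.out.println(response.body());
    }
}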

Mule 4: Consume a SOAP Webservice

The Web Service Consumer is a connector in Mule 4 that you can configure to point to a SOAP-based web service. The Web Service Consumer calls a web service hosted elsewhere, described by a WSDL, and gets the response. This connector simplifies the process and encapsulates all the features needed to consume a SOAP-based web service. When no product-specific connector is available (for example for ServiceNow or Workday) and the product is exposed as a SOAP-based web service, the Web Service Consumer connector lets you consume it.

The main features of this connector are:

  • Consuming DOC Literal Web services.
  • SOAP multipart messages.
  • SOAP Headers.
  • DataSense support for SOAP Headers, SOAP Body, and Attachment.
  • Embedded DataWeave transformations inside the operation.
  • Support and Unified experience for SOAP with attachments and MTOM handling.
  • Custom HTTP configuration as transport (runtime and design time).
  • Web Service Security (WS Security) support.

Connector Configuration – In this section we define the connector configuration to communicate with the SOAP-based web service endpoint. By default, the connector uses a simple, unprotected HTTP configuration to send all outgoing SOAP messages. In the connector configuration you can select your SOAP version from a dropdown and provide the WSDL location. The connector extracts and populates the service, port, and web service endpoint address from the WSDL file.

But if you are using a secure endpoint address with HTTPS, you need to configure a custom transport configuration for HTTPS.

These are the steps to enable your secure HTTPS endpoint.

  • Create a JKS file with the keytool command:
keytool -keystore clientkeystore.jks -genkey -alias client
  • Download the certificate from the WSDL HTTPS endpoint and add it to your JKS file with the command below:
keytool -importcert -file certificate.cer -keystore clientkeystore.jks -alias "Alias"
  • Now configure the TLS context for the Web Service Consumer connector:
<tls:context name="TLS_Context" doc:name="TLS Context" doc:id="f634b824-2695-4d5f-8789-7a309b1511cb" >
           <tls:trust-store path="certificate/clientkeystore.jks" password="xxxxxx" type="jks" />
     </tls:context>
  • Now configure the HTTP request configuration for the HTTPS endpoint:
<http:request-config name="HTTPS_Request_configuration" doc:name="HTTPS Request configuration" doc:id="02db1fd9-9f04-4eae-83cf-df43effd25d2">
           <http:request-connection protocol="HTTPS" host="service.vanrish.com" port="443" tlsContext="TLS_Context">
     	   </http:request-connection>
	</http:request-config>

  • Once the TLS and HTTPS configurations are in place, select the HTTP request configuration in the Web Service Consumer config:
<wsc:config name="BookService_Web_Service_Consumer_Config" doc:name="Book Web Service Consumer Config" doc:id="59fd0d73-f90d-4cf0-9855-c008307067a2" >
 <wsc:connection wsdlLocation="wsdl\bookservice.wsdl" service="BookService" port="BookServicePort" address="https://service.vanrish.com:443/service/BookService">
  <wsc:custom-transport-configuration >
    <wsc:http-transport-configuration requesterConfig="HTTPS_Request_configuration"/>
  </wsc:custom-transport-configuration>
 </wsc:connection>
</wsc:config>

Connector Parameters – If the connector configuration is set up properly, your operations and their parameters are populated from the WSDL as dropdown options.

In the Message section, three parameters are available:

  1. Body – The body is the main part of the SOAP message. The body element accepts embedded DataWeave scripts as values, so you can construct the XML request without side effects on the message or having to use multiple components to create the request.
  2. Headers – The headers element contains application-specific information (such as authentication, payment, and so on) about the SOAP message. This element also accepts embedded DataWeave scripts as values.
  3. Attachments – The attachments element enables you to bind attachments to the SOAP message. This element also accepts embedded DataWeave scripts as values.

Since you configured a custom HTTPS transport for your Web Service Consumer connector, you can also configure transport headers. In the Transport headers section, select “Edit inline” and add all your header parameters inline:

<wsc:consume doc:name="Consume" doc:id="ca5a1247-7cf6-4c7f-a442-b6fd037c13c9" config-ref="BookService_Web_Service_Consumer_Config" operation="AddBook">
       <wsc:transport-headers >
          <wsc:transport-header key="SOAPAction" value="AddBook" />
          <wsc:transport-header key="Content-Type" value="text/xml; charset=UTF-8" />
          <wsc:transport-header key="Authorization" value="${book.authorization}" />
       </wsc:transport-headers>
 </wsc:consume>

Here is the Web Service Consumer flow diagram.

Code for this flow

<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core"
	xmlns:http="http://www.mulesoft.org/schema/mule/http" xmlns:tls="http://www.mulesoft.org/schema/mule/tls"
	xmlns:wsc="http://www.mulesoft.org/schema/mule/wsc"
	xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="
http://www.mulesoft.org/schema/mule/ee/core http://www.mulesoft.org/schema/mule/ee/core/current/mule-ee.xsd http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/wsc http://www.mulesoft.org/schema/mule/wsc/current/mule-wsc.xsd
http://www.mulesoft.org/schema/mule/tls http://www.mulesoft.org/schema/mule/tls/current/mule-tls.xsd
http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd">
	
	<wsc:config name="BookService_Web_Service_Consumer_Config" doc:name="Book Web Service Consumer Config" doc:id="59fd0d73-f90d-4cf0-9855-c008307067a2" >
            <wsc:connection wsdlLocation="wsdl\bookservice.wsdl" service="BookService" port="BookServicePort" address="https://service.vanrish.com:443/service/BookService">
	            <wsc:custom-transport-configuration >
					<wsc:http-transport-configuration requesterConfig="HTTPS_Request_configuration" />
				</wsc:custom-transport-configuration>
            </wsc:connection>
     </wsc:config>

	<tls:context name="TLS_Context" doc:name="TLS Context" doc:id="f634b824-2695-4d5f-8789-7a309b1511cb" >
           <tls:trust-store path="certificate/clientkeystore.jks" password="changeit" type="jks" />
     </tls:context>

    <http:request-config name="HTTPS_Request_configuration" doc:name="HTTPS Request configuration" doc:id="02db1fd9-9f04-4eae-83cf-df43effd25d2">
           <http:request-connection protocol="HTTPS" host="service.vanrish.com" port="443" tlsContext="TLS_Context">
     	   </http:request-connection>
	</http:request-config>

	<sub-flow name="addbook-ServiceSub_Flow" doc:id="511f0969-0b7d-4b7e-a113-60ef03e97648" >
             <logger level="INFO" doc:name="Logger" doc:id="e6bd0106-e512-4fdd-97cf-1dbd77e1e0e7" message="Entering into AddBook flow"/>
                             <ee:transform doc:name="Transform Message" doc:id="06cc17de-86a9-4c53-a2f4-167d9561bed9" >
                                           <ee:message >
                                                          <ee:set-payload ><![CDATA[%dw 2.0
 output application/xml skipNullOn="everywhere"
 ns n0  https://www.service.vanrish.com/BookService/
  ---
   n0#AddBook:
         {
                 n0#Book : {
                 	ID: payload.id,
                 	Title : payload.title,
                 	Author : payload.author
                 }
         }]]></ee:set-payload>
                                           </ee:message>
                             </ee:transform>
                             <logger level="INFO" doc:name="Logger" doc:id="ce84f628-7b38-4d2d-b5e3-9fdded2c9289" message="soap request --> #[payload]"/>
                             
<wsc:consume doc:name="Consume" doc:id="ca5a1247-7cf6-4c7f-a442-b6fd037c13c9" config-ref="BookService_Web_Service_Consumer_Config" operation="AddBook">
                                           <wsc:transport-headers >
                                                          <wsc:transport-header key="SOAPAction" value="AddBook" />
                                                          <wsc:transport-header key="Content-Type" value="text/xml; charset=UTF-8" />
                                                          <wsc:transport-header key="Authorization" value="${book.authorization}" />
                                           </wsc:transport-headers>
                             </wsc:consume>
                             <logger level="INFO" doc:name="Logger" doc:id="680d69e0-2b01-480c-afe7-660ca22b2f9f" message="AddBook Output-->#[payload]"/>
                             <ee:transform doc:name="Transform Message" doc:id="72d26561-107a-4c6e-a7d4-85bd18e0d316" >
                                           <ee:message >
                                                          <ee:set-payload ><![CDATA[%dw 2.0
ns ns0 https://www.service.vanrish.com/BookService/
 

output application/json skipNullOn="everywhere"
---
payload.body.ns0#AddBookResponse]]></ee:set-payload>
                                           </ee:message>
                             </ee:transform>
                             <logger level="INFO" doc:name="Logger" doc:id="ea517185-efa4-4bf2-a03f-e8bd4d308e80" message="Output AddBook --> #[payload]"/>
              </sub-flow>
</mule>

MuleSoft 4: Using Java in DataWeave 2.0

Mule 4 introduces DataWeave 2.0 as the default expression language replacing Mule Expression Language (MEL). DataWeave 2.0 is tightly integrated with the Mule 4 runtime engine, which runs the scripts and expressions in your Mule application.

Since DataWeave 2.0 is the default expression language for Mule 4, DataWeave can be used almost anywhere within your Mule application. In some use cases, DataWeave needs to call a Java method or instantiate a Java class to execute complex Java business logic.

In my previous blog, I explained the usage of Java within a MuleSoft flow. In this blog, I explain the usage of Java within DataWeave 2.0.

There are two ways we can use Java within DataWeave code:

  1. Calling a Java method
  2. Instantiating a Java class

1. Calling a Java method – There is a restriction in DataWeave when calling Java: you can only call static methods via DataWeave (methods that belong to a Java class, not methods that belong to a specific instance of a class). Before making a method call, you must import the class.

The Java class being called, AppUtils.java, is shown at the end of this post.

In DataWeave, this method can be used in multiple ways.

Import only the method instead of the whole class:

%dw 2.0
import java!com::vanrish::AppUtils::encode
output application/json
---
{
    encode: encode("mystring" as String)
}

Import and call it in a single line:

%dw 2.0
output application/json
---
{
    encode: java!com::vanrish::AppUtils::encode("mystring" as String)
}
2. Instantiating a Java class – DataWeave allows you to instantiate a new object of any Java class, but you cannot call its instance methods through DataWeave; you can only refer to its fields as variables.

%dw 2.0
import java!com::vanrish::AppUtils
output application/json
---
{
     value: AppUtils::new().data
}

AppUtils.java

package com.vanrish;

import java.nio.charset.StandardCharsets;
import java.util.Base64;

/**
 * @author rajnish
 */
public class AppUtils {

	private String data;

	/**
	 * Base64-encodes the given plain text (called statically from DataWeave).
	 *
	 * @param plainString the plain text to encode
	 * @return the Base64-encoded string
	 */
	public static String encode(String plainString) {
		return Base64.getEncoder()
				.encodeToString(plainString.getBytes(StandardCharsets.UTF_8));
	}

	/**
	 * @param dataStr the input string
	 * @return the stored data value
	 */
	public String getData(String dataStr) {
		data = dataStr + " : test";
		return data;
	}
}