From Deep Learning to Deep Reasoning
with Senfino XAI

While Deep Learning is remarkable at pattern recognition, classification, and prediction, it falls short when it comes to personalization, explainability, and understanding its own rationale.

The underlying architecture of Senfino’s Deep Reasoning XAI technology is fundamentally different under the hood — a radically different AI algorithm design that overcomes Deep Learning’s shortcomings and is purpose-built for enterprise-grade, human-in-the-loop decision support systems.

From Black Box to Glass Box
Why Does Explainability Matter?

Not only does eXplainable AI offer the “why” behind machine-based recommendations, it also serves as the connective tissue between human and machine, allowing the two parties to better communicate and augment one another.


Not all eXplainable AI is created equal. While most companies try to convert a black box into a glass box, ours is a glass box by design, with explainability engineered into its very foundation.


Humans’ underlying reasoning framework is built upon complex rules. Senfino’s Deep Reasoning AI is built upon a rule-based architecture to model how we ourselves are hardwired to think.
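To make the rule-based idea concrete, here is a minimal sketch of explainable rule-based inference. The rule names, conditions, and facts below are purely illustrative assumptions, not Senfino’s actual rules; the point is that every conclusion carries the rule that produced it, so the “why” comes for free.

```python
# Minimal sketch of explainable rule-based inference.
# Rule names, conditions, and thresholds are illustrative only.

def evaluate(facts, rules):
    """Fire every rule whose condition holds; return each conclusion
    paired with the rule that produced it (the explanation)."""
    conclusions = []
    for name, condition, conclusion in rules:
        if condition(facts):
            conclusions.append((conclusion, f"because rule '{name}' matched"))
    return conclusions

rules = [
    ("high-risk-order",
     lambda f: f["order_value"] > 10_000 and f["new_customer"],
     "flag for manual review"),
    ("loyal-customer",
     lambda f: f["years_active"] >= 5,
     "offer loyalty discount"),
]

facts = {"order_value": 12_500, "new_customer": True, "years_active": 0}
for conclusion, why in evaluate(facts, rules):
    print(conclusion, "-", why)
```

Because the system’s knowledge lives in human-readable rules rather than opaque weights, the explanation is simply a trace of which rules fired.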

Self Learning

Deep Learning is designed to draw correlations between even the most obscure data points. Senfino’s Deep Reasoning leverages ML differently — to infer causal relationships in the data.


Black box models are difficult to tune once trained. Senfino XAI’s models are more fluid and evolutionary, meaning they can adapt to changes in human decision-making through feedback systems.

“I'm not saying I want to forget deep learning. On the contrary, I want to build on it. But we need to be able to extend it to do things like reasoning, learning causality…”

- Yoshua Bengio, one of the fathers of Deep Learning

With Senfino XAI at the core
of your enterprise decision support systems…

Trust the Results

It’s risky to take machine-based results at face value. Through explanation, Senfino’s XAI technology delivers more value to end users by coupling recommended courses of action with intelligible insights into the machine’s rationale.

Retain Control

Enterprise decision-making is too complex to automate fully. Senfino’s eXplainable AI employs a human-in-the-loop design built to augment decision-making, with humans making the final call.

Remain Compliant

Policy-makers are wising up to risky, black-box AI. In regulated industries, compliance mandates bring issues of auditability, transparency, bias, and data privacy to the forefront. Senfino’s XAI is designed from the ground up to satisfy these compliance demands natively.

Our Peer-Reviewed Technical White Papers on the Global Conference Circuit

A Content-Based
Recommendation System Using
Neuro-Fuzzy Approach

Senfino’s Content-Based Recommendation System Using a Neuro-Fuzzy Approach provides human- and machine-interpretable explanations in an AI-assistant context. Our neuro-fuzzy architecture delivers substantial performance improvements, returning highly accurate, personalized content and recommendations based on individual behavior without relying on collaborative filtering (crowd sampling).

Towards Interpretability of
the Movie Recommender Based
on Neuro-Fuzzy Approach

Senfino’s Fast Computing Framework for Convolutional Neural Networks (FCFCNN) embodies a unique XAI architecture that reduces processing overhead while accelerating forward signal flow. Neurons store reference pointers to the corresponding regions of the previous layer’s input, so propagating the signal no longer requires searching for connections between layers. Additionally, reference pointers are batched along with feature maps in multi-feature input containers and treated as vectors, speeding up calculations across CNN layers. In image-validation benchmarks, FCFCNN ran twice as fast as the leading OverFeat CNN.
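The general reference-pointer idea can be sketched as follows. This is a generic im2col-style construction assumed for illustration, not the FCFCNN implementation from the paper: the input indices each output neuron reads from are precomputed once, so the forward pass is a simple gather plus one matrix multiply, with no per-step connection search.

```python
import numpy as np

def build_pointers(in_h, in_w, k):
    """Precompute flat input indices for every k x k patch
    (valid convolution). Done once, reused on every forward pass."""
    out_h, out_w = in_h - k + 1, in_w - k + 1
    ptrs = np.empty((out_h * out_w, k * k), dtype=np.intp)
    for i in range(out_h):
        for j in range(out_w):
            ptrs[i * out_w + j] = [(i + di) * in_w + (j + dj)
                                   for di in range(k) for dj in range(k)]
    return ptrs, (out_h, out_w)

def conv2d(image, kernel, ptrs, out_shape):
    """Forward pass: gather patches via the precomputed pointers,
    then a single matrix-vector product."""
    patches = image.ravel()[ptrs]              # (out_pixels, k*k)
    return (patches @ kernel.ravel()).reshape(out_shape)

img = np.arange(16, dtype=float).reshape(4, 4)
ker = np.ones((2, 2))
ptrs, shape = build_pointers(4, 4, 2)
print(conv2d(img, ker, ptrs, shape))
```

Trading a one-time indexing pass for pointer lookups on every forward pass is what removes the repeated connection search between layers.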

On Explainable Recommender Systems
Based on Fuzzy Rule Generation

This paper presents an application of the Zero-Order Takagi-Sugeno-Kang method to explainable recommender systems. The method is based on the Wang-Mendel and the Nozaki-Ishibuchi-Tanaka techniques for the generation of fuzzy rules, and it is best suited to predicting users’ ratings. The model can be optimized using the Grey Wolf Optimizer without affecting its interpretability. The performance of the methods is demonstrated on the MovieLens 10M dataset.
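A zero-order TSK system predicts a rating as the membership-weighted average of constant rule consequents. The sketch below illustrates only that inference step; the membership functions, rule constants, and feature names are illustrative assumptions, and the Wang-Mendel / Nozaki-Ishibuchi-Tanaka rule-generation step from the paper is not reproduced here.

```python
import math

def gauss(x, center, width):
    """Gaussian membership degree of x in a fuzzy set."""
    return math.exp(-((x - center) ** 2) / (2 * width ** 2))

# Each rule: (fuzzy-set centers per feature, constant consequent).
# Illustrative rules, e.g. "recent, popular film -> ~4.5 stars".
rules = [
    ({"year": 1990.0, "popularity": 0.2}, 3.0),
    ({"year": 2015.0, "popularity": 0.9}, 4.5),
]
widths = {"year": 15.0, "popularity": 0.3}

def tsk_predict(features):
    """Zero-order TSK inference: weighted average of rule constants,
    with weights = product of the antecedent membership degrees."""
    num = den = 0.0
    for centers, const in rules:
        w = 1.0
        for name, x in features.items():
            w *= gauss(x, centers[name], widths[name])
        num += w * const
        den += w
    return num / den if den else 0.0

print(tsk_predict({"year": 2012.0, "popularity": 0.8}))
```

Because each rule is a readable if-then statement with a constant conclusion, the firing strengths themselves explain why a given rating was predicted.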

According to PwC, XAI is one of the
top 10 hottest AI trends of 2018

Use our core technology
to supercharge your workflow.
Build your custom XAI solution.

Custom Solutions Inquiry

XAI news from around the web

How Artificial Intelligence Will Change the Airline Passenger Experience

DARPA’s XAI Explainable Artificial Intelligence Future

Five Ways Artificial Intelligence Is Disrupting Asset Management

Is Art Created by AI Really Art?