Roy H. Perlis

Preferential Mixture-of-Experts: Interpretable Models that Rely on Human Expertise as much as Possible

Jan 13, 2021

Prediction-Constrained Topic Models for Antidepressant Recommendation

Dec 01, 2017

Prediction-Constrained Training for Semi-Supervised Mixture and Topic Models

Jul 23, 2017