A smart assistive system is designed and implemented to provide severely disabled patients with a fair level of autonomy and ease of communication. The system is based on an interactive, custom-tailored, artefact-mitigated, and fault-tolerant Brain-Computer Interface (BCI), and it operates in real time using custom MATLAB code, an Arduino microcontroller, and a 14-sensor Emotiv wireless EEG headset. The system differentiates between involuntary eyeblinks, which are treated as artefacts and removed, and deliberate rapid eyeblinks, which are treated as synchronizing signals used for distress calls, start/stop signalling, and fault tolerance through the confirmation of commands before their execution. Two classes of thoughts, custom-tailored to the capabilities of each patient, are used to navigate an adjustable menu of commands that caters for the individual needs of the user. Independent Component Analysis (ICA) and correlation are used to detect and mitigate the deleterious effect of the artefacts corrupting the EEG signals intended for classification. A neural network operating on sub-band-power features extracted with the wavelet transform serves as the classifier, with a success rate ranging from 82% to 90%. The system can be flexibly adapted to various scenarios involving binary load control (switching a TV, light, A/C, etc. on/off) as well as multilevel control (adjusting bed level, TV volume, room temperature, etc.). The merits of this system have been demonstrated in practice, showing its potential contribution to smart hospitals and patient-care facilities. © 2018 Institution of Engineering and Technology. All rights reserved.
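As a rough illustration of how the deliberate/involuntary eyeblink distinction described above could be realised, the following Python sketch flags a rapid burst of blinks as a command marker while a lone blink is ignored as an artefact. The window length and minimum blink count are illustrative assumptions; the abstract does not state the paper's actual thresholds.

```python
def is_deliberate_blink_burst(blink_times, window_s=1.5, min_blinks=3):
    """Return True if at least min_blinks blinks fall inside any
    window_s-second span.

    blink_times : sorted list of blink onset timestamps in seconds.
    A lone involuntary blink never reaches the count, so it is treated
    as an artefact; a rapid burst is read as a synchronizing signal
    (e.g. a distress call or command confirmation).
    """
    for i in range(len(blink_times) - min_blinks + 1):
        if blink_times[i + min_blinks - 1] - blink_times[i] <= window_s:
            return True
    return False


# Example: an isolated blink vs. a deliberate triple blink.
print(is_deliberate_blink_burst([2.0]))              # False (involuntary)
print(is_deliberate_blink_burst([2.0, 2.4, 2.9]))    # True  (deliberate)
```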
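The artefact-mitigation step (ICA combined with correlation) can be sketched as follows. This is a minimal Python approximation of the idea, not the authors' MATLAB implementation: the blink reference waveform, the correlation threshold of 0.7, and the use of scikit-learn's FastICA are all assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA


def remove_blink_artifacts(eeg, blink_ref, corr_threshold=0.7):
    """Suppress eyeblink artefacts via ICA plus correlation.

    eeg       : (n_samples, n_channels) raw EEG, e.g. the 14 Emotiv channels.
    blink_ref : (n_samples,) reference blink waveform (assumed available,
                e.g. a frontal-channel blink template).
    Independent components whose absolute correlation with the reference
    exceeds the threshold are zeroed before reconstruction.
    """
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)        # (n_samples, n_components)

    for k in range(sources.shape[1]):
        r = np.corrcoef(sources[:, k], blink_ref)[0, 1]
        if abs(r) > corr_threshold:
            sources[:, k] = 0.0             # drop the artefact component

    return ica.inverse_transform(sources)   # cleaned EEG, same shape as input
```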
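Similarly, the sub-band-power feature extraction with the wavelet transform and the neural-network classifier can be sketched in Python with PyWavelets and scikit-learn. The db4 wavelet, the decomposition level, the network size, and the synthetic stand-in data are assumptions filling in details the abstract does not provide.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier


def subband_power_features(window, wavelet="db4", level=4):
    """Relative power in each wavelet sub-band of one EEG channel window."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    powers = np.array([np.sum(c ** 2) for c in coeffs])
    return powers / powers.sum()            # normalise to relative power


def extract_features(epoch):
    """Concatenate sub-band powers over all channels of one epoch.

    epoch : (n_samples, n_channels) segment of cleaned EEG.
    """
    return np.concatenate([subband_power_features(epoch[:, ch])
                           for ch in range(epoch.shape[1])])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for labelled training epochs of the two thought
    # classes (40 epochs, 128 samples x 14 channels each); real data would
    # come from the EEG headset after artefact mitigation.
    epochs = rng.standard_normal((40, 128, 14))
    labels = np.repeat([0, 1], 20)
    X = np.stack([extract_features(e) for e in epochs])
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                        random_state=0).fit(X, labels)
    print("training accuracy:", clf.score(X, labels))
```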