Certainly, there’s a place for beliefs, but they never advanced society. Once upon a time someone tripped over a stone and hurt their leg, and that was seen as God’s punishment. Come the age of reason and science, and it was simply because they didn’t look where they were going. Someone else discovered penicillin and saved that damaged leg ... conversely, some still believe Obama is a god and that politicians can change the earth’s climate.
There’s a whole set of essays on new-age magic written elsewhere on the forum. All good fun, but is the West reverting to beliefs, safe in the comfort that when it all goes wrong science will come to the rescue?