Epistemic something or 'nother

I came across this in an article about how to respond to flat earthers: Epistemic Contextualism (Stanford Encyclopedia of Philosophy).
They give an example of identifying a zebra, but then the challenge is made: “Can you prove it is not a well-disguised donkey?” The answer would be “no”, or “not without a lot of effort”, but that doesn’t prove I can’t identify zebras. I can; it’s just that in a world where people are disguising donkeys as zebras, and are good at it, there is now a challenge that makes me less certain I can identify this particular animal. I’m also not convinced there are donkey disguisers in the world, so I might focus my efforts on that question instead of on the specific zebra/donkey itself.
It’s kind of a technical discussion of a troll tactic: start out seeming reasonable, then shift the context to a higher degree of skepticism, and skip past the fact that you did it, or that it’s not normal to do that. In fact, acting like it is normal is the tactic. No one could live at the higher level of skepticism you just shifted to, but you only need it to show that something can’t be proven.

Well, if that didn’t pique your interest, here’s a longer example: the old brain-in-a-vat solipsism argument, used to derail just about any inquiry:
3.1 A Skeptical Puzzle
Consider one particular form of skeptical argument upon which leading contextualists have focused (e.g., Cohen 1986, 1988, 2005a; DeRose 1995; Neta 2003a & b; cf. Unger 1975). We can call it ‘SA’, for ‘skeptical argument’:
P1. I don’t know that not-h [h = some skeptical ‘hypothesis’; e.g., that I’m a bodiless brain in a vat, being stimulated to have just those experiences I would be having if I weren’t a ‘BIV’].
P2. If I don’t know that not-h, then I don’t know that p [p = some mundane proposition which we commonly take ourselves to know; e.g., that I have hands].
C. So, I don’t know that p
SA constitutes a puzzle just because (a) each of the premises enjoys a fair amount of plausibility. As to P1, how, after all, could I know that I’m not a bodiless brain in a vat? By waving my arms around? As to P2, it is just an instance of the closure principle for knowledge,[6] which many are inclined to regard as axiomatic. But (b) given our intuitive anti-skepticism, C seems immensely implausible, even though (c) the argument appears to be formally valid.
On the face of it, then, we are confronted with a paradox—a set of independently very plausible but seemingly mutually inconsistent propositions. Because that is the problem, a complete solution to SA will have to do two things (DeRose 1995): explain which of the assumptions lying behind the generation of the paradox should, in fact, be rejected, and why; and explain too how we got into this mess — or, more formally, why the assumption singled out for rejection struck us as plausible in the first place.
3.2 The General Form of the Contextualist Solution
At first blush, it might seem that there are just three possible responses to SA:
we can simply capitulate to skepticism, accepting SA’s conclusion and abandoning our claims to know such things as that we have hands;
we can reject P2 and the closure principle upon which it trades; or
we can reject P1.
***** Here’s an excerpt from one of the proposed solutions:
Thus an utterance of P1, such as occurs in SA, expresses a truth only because, owing to the introduction of a high-standards context, what it expresses is that the subject does not stand in an extraordinarily strong epistemic position with regard to the proposition that he has hands. But that, of course, is compatible with his meeting lower, though still perhaps quite demanding, standards, such as those in play in more ordinary contexts. As DeRose puts it:
…the fact that the skeptic can…install very high standards which we don’t live up to has no tendency to show that we don’t satisfy the more relaxed standards that are in place in ordinary conversations. Thus…our ordinary claims to know [are] safeguarded from the apparently powerful attacks of the skeptic, while, at the same time, the persuasiveness of the skeptical arguments is explained. (DeRose 1992, 917)
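(An aside from me, not from the article: P2 leans on the closure principle mentioned above. Spelling that step out, with K standing for “the subject knows that”, and assuming the subject also knows that having hands entails not being a handless BIV, the argument runs roughly:)

\[
\begin{aligned}
&\text{Closure:} && Kp \wedge K(p \rightarrow q) \rightarrow Kq \\
&\text{Instance:} && K(\text{hands}) \wedge K(\text{hands} \rightarrow \neg h) \rightarrow K(\neg h) \\
&\text{Given } K(\text{hands} \rightarrow \neg h): && K(\text{hands}) \rightarrow K(\neg h) \\
&\text{Contraposing (= P2):} && \neg K(\neg h) \rightarrow \neg K(\text{hands}) \\
&\text{With P1 } (\neg K(\neg h)): && \neg K(\text{hands}) \quad (= \text{C})
\end{aligned}
\]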

I read the article, and it presented quite complicated thoughts in a language slightly beyond my current English capabilities.
If I understand it correctly, it’s a combination of a philosophical and mathematical approach to perception, decision-making and something that is referred to as knowledge. It reminds me of methods I applied while I was working on game AI.
The example with the zebra looks completely flawed to me. Are we looking at a real animal, or a photograph, or a digital simulation which might change its shape?
One of the approaches I implemented in the AI was the question “Do I have enough information to make a decision?”, and when the answer was “no”, the AI was directed to obtain more information about the subject. The logic presented in the example ends up, for me, as a “lazily written script” where the author decided to end the script without getting additional data, which of course is the more complicated route to a better decision. If it is meant to be a case against Epistemic Contextualism, it’s a bad example. I’m not sure if it was a “strawman” or somebody really tried to argue against it this way.
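(Just to illustrate what I mean by that gate, here is a minimal Python sketch; the confidence threshold and the classify() and observe() helpers are my own invented placeholders, not anything from the article or from the actual game code:)

# Hypothetical sketch of the "do I have enough information?" gate.
# min_confidence, classify() and observe() are invented placeholders.
from collections import Counter

def classify(observations):
    # Toy classifier: the most frequent label and how dominant it is.
    label, seen = Counter(observations).most_common(1)[0]
    return label, seen / len(observations)

def observe():
    # Stand-in for "look again" or "ask another source".
    return "zebra"

def decide(observations, min_confidence=0.8, max_lookups=3):
    # Commit to a decision only once the evidence clears a confidence bar;
    # otherwise go and gather more data instead of guessing.
    for _ in range(max_lookups):
        label, confidence = classify(observations)
        if confidence >= min_confidence:
            return label
        observations.append(observe())
    return None  # still unsure: defer the decision

print(decide(["zebra"]))  # -> zebra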
The examples used for EC usually look logical up to a certain point, where a “negative” assumption is made that something is not as it looks, like the zebra which is not a zebra. There is an unfounded doubt, which is later exaggerated, and the original claim about the zebra is abandoned on the basis of that assumption.
For the purposes of the AI, I had to rank which information is more or less trustworthy:
a) Information gained first hand, verified more than once, unfalsified
b) Information gained from another source, verified more than once, unfalsified
c) Information gained first hand, verified at least once, unfalsified
d) Information gained from another source, verified at least once, unfalsified
e) Information gained first hand, unverified, unfalsified
f) Information gained from another source, unverified, unfalsified
g) Information gained first hand, but falsified at least once
h) Information gained from another source, but falsified at least once
i) Information gained first hand, but falsified multiple times
j) Information gained from another source, but falsified multiple times
Let’s say we start at E: we looked at an animal and decided “it’s a zebra”.
The problem with the zebra example is that the original decision, already made, was retracted without actual falsification, merely because a claim was made. My usual position is that there has to be a reason to doubt, not just an emotional feeling, and the next reaction should be verification, not retraction of the decision.
After that process we either end up at G, if we find out it’s a donkey, or at C, if it’s confirmed to be an actual zebra. If the examination isn’t performed, we stay at E and the original decision should not be retracted. So in two out of three cases there is no change to the decision, which explains why people are naturally conservative.
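(If it helps, here is a rough Python sketch of that grading and the verify/falsify transitions; the function and variable names are mine, invented for illustration, not from the original AI:)

# Rough sketch of trust grades a)-j) and the verify/falsify transitions.
# Falsification is checked first, then how many times the item was verified.

def grade(first_hand, verified, falsified):
    if falsified >= 2:
        return "I" if first_hand else "J"
    if falsified == 1:
        return "G" if first_hand else "H"
    if verified >= 2:
        return "A" if first_hand else "B"
    if verified == 1:
        return "C" if first_hand else "D"
    return "E" if first_hand else "F"  # unverified, unfalsified

# The zebra case: a first-hand, unverified observation starts at E.
first_hand, verified, falsified = True, 0, 0
print(grade(first_hand, verified, falsified))  # E

# A challenge alone ("it might be a disguised donkey") changes nothing;
# only an actual examination moves the grade.
examination_done, turned_out_to_be_donkey = True, False
if examination_done:
    if turned_out_to_be_donkey:
        falsified += 1   # would land at G
    else:
        verified += 1    # lands at C
print(grade(first_hand, verified, falsified))  # C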
Most problems arise with information from other sources without verification, when facts are not taken into consideration, or when falsification is not possible.
In regard to common perception and religion:
Religious people claim they have knowledge at grade A, or sometimes B; in fact they have grade F (it looks high, but it isn’t, and anything at grade E or lower isn’t worth spreading).
Skeptics usually have and discuss information at grades B and D; if they are scientists or debunkers it’s A and C, since you just need to take part in the verification and eventual falsification yourself.

Thanks for trying. I can barely keep up with it as an English speaker. It uses the term “skeptic” in ways that get confusing, so your definition, where a skeptic should operate at your B or D level or above, is not what they mean. Usually, when the article says “skeptic”, it means someone who requires a high level of proof for things that are normally taken for granted.
In the zebra case, the skeptic is asking that you consider highly unusual things, like donkeys that could be disguised as zebras. It is assumed that the zebra being discussed looks like a regular zebra and that they are somewhere zebras are found, like Africa or a zoo. The skeptic doesn’t prove anything. Instead of showing that the person can’t identify zebras, he only shows that he can’t tell if a zebra is disguised. There are many reasons he can’t know that: he has never heard of such a thing and has no idea how easy it is to do or how well it can be done.

Theoretical philosophy. Without knowing and understanding the context it appears to lead nowhere, while terms such as “skeptic” might be used slightly differently from their common meanings.
(To clarify: a skeptic such as us is one because of a fact-based approach to life. That differs from “skeptic” as a philosophical position.)
To me it seems that the authors who proposed this stance are making assumptions and decisions purely on doubt, like “the flight schedule might be misprinted”.

The name of the discussion includes "contextual". That's what the whole discussion is about. What it's showing is how you can recognize when someone is trying to prove something but is really only shifting the context. By raising doubt, it can appear that you have made an argument against something, like, say, global warming, but all you have really done is raise doubts about what "scientific consensus" means, how it is calculated, or whether scientists actually have a different motivation than the one you originally assumed.

Attempts to derail the discussion to “what is a scientific consensus” are a bit blatant. The rule of thumb is that it’s important to distinguish between facts and opinions.
Let’s say there is a scientist who makes claims about climate change, that the current trend is “warming”. I ask him whether the model reflects the known volcanic eruptions in the 19th century and their effect on the climate (it was colder).
The truthful answer is “yes”, because meteorological measurements definitely contain that data. I would accept “I don’t know”, but “I believe it’s not a factor” isn’t what I asked for.
The exchange started with him providing a piece of information; I provided another piece of information and asked whether it had been considered. If the answer is just an opinion, it becomes clear that facts are less important. If I were to offer an opinion like “I believe there is a reason why it was colder in the 19th century” but did not provide any reason or information, it would be only an opinion and I would not argue about it.

It would be nice if this was just about things I could simply step away from or change the subject to football, but these context shifting techniques are used by the people currently in charge of determining carbon taxes and environmental law and the building of new pipelines. Not arguing is not really an option.
Usually, what is perceived by skeptics or scientists as a distraction is used as a second level of communication between people in public who are aware that they share a goal and that there are other people who share the same goal. Those people don't share that kind of honesty... the kind someone like Penn has, for example. Usually they mistake honesty for naivety.