In the noble past, when photography was done on film, this was a well-understood problem. Now that it's all done by computers, photographers have forgotten the basics.
This is a phenomenon called 'anomalous reflection'. It occurs when photographing objects at the blue end of the spectrum, and it comes about because the spectral sensitivity of the eye differs from the spectral sensitivity of the film (or digital sensor). A more common example: photographs of Morning Glory flowers are never the same color we see when we view the flowers directly.
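To make the mechanism concrete, here's a toy numpy sketch. The Gaussian sensitivity curves and the flower spectrum are invented for illustration, not real measured data, but they show why two detectors with different spectral sensitivities record different amounts of "blue" from the same light:

```python
import numpy as np

wavelengths = np.arange(380, 781)  # visible range, 1 nm steps

def gaussian(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Invented purple-ish flower spectrum: strong violet reflectance plus some red.
flower = 0.9 * gaussian(420, 25) + 0.4 * gaussian(650, 40)

eye_blue    = gaussian(445, 30)  # pretend human short-wavelength cone
sensor_blue = gaussian(470, 35)  # pretend camera blue channel

# Response = spectrum weighted by sensitivity, summed over wavelength (1 nm steps).
eye_response    = float((flower * eye_blue).sum())
sensor_response = float((flower * sensor_blue).sum())

print(f"eye:    {eye_response:.1f}")
print(f"sensor: {sensor_response:.1f}")
# The two numbers differ: for the same object, the camera records a
# different amount of "blue" than the eye sees.
```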
Hopping into the conversation:
This is still an issue with digital, partly because of sensor limitations, but also because of color space issues.
I suspect the purple in question is outside the sRGB color gamut. sRGB is the most basic standard color space, and the one used most often.
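You can actually check whether a color fits in sRGB. Here's a minimal numpy sketch using the standard XYZ-to-linear-sRGB matrix (D65); the XYZ value is a made-up saturated purple, just for illustration. Any component outside [0, 1] means the color can't be represented in sRGB:

```python
import numpy as np

# Standard sRGB (D65) matrix: CIE XYZ -> linear sRGB.
XYZ_TO_LINEAR_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

purple_xyz = np.array([0.25, 0.10, 0.45])  # hypothetical saturated purple
rgb_linear = XYZ_TO_LINEAR_SRGB @ purple_xyz

print(rgb_linear)  # approx [ 0.432, -0.036,  0.469]
print("in sRGB gamut:", bool(np.all((rgb_linear >= 0) & (rgb_linear <= 1))))
# The negative green component is typical of saturated purples and
# violets: the color simply has no exact sRGB representation.
```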
Your camera shoots RAW images, then compresses them and converts them into JPEGs. When it does, it renders the colors into a specific color space, the most common one being sRGB. This space is recognized by virtually all displays, software, etc., which is why it is the default color space for most cameras. However, it is quite limited: many deep shades, particularly purples, reds, and some vibrant blues and yellows, get compressed to less vibrant colors.
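Here's a sketch of what that compression looks like in the crudest case, a hard clip into [0, 1] before gamma encoding. Real converters use smarter gamut mapping, but the effect is the same: the out-of-gamut purple from the check above comes out tamer in the JPEG:

```python
import numpy as np

def srgb_encode(linear):
    """Clip to [0, 1], then apply the standard sRGB transfer curve."""
    linear = np.clip(linear, 0.0, 1.0)  # hard clip: the gamut "compression"
    return np.where(linear <= 0.0031308,
                    12.92 * linear,
                    1.055 * linear ** (1 / 2.4) - 0.055)

out_of_gamut = np.array([0.43, -0.04, 0.47])  # purple from the gamut check
encoded = srgb_encode(out_of_gamut)
print(np.round(encoded * 255).astype(int))  # 8-bit values stored in the JPEG
# The negative green was forced to 0: part of the original hue and
# saturation is simply gone by the time the JPEG is written.
```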
Your camera might allow you to save JPEGs in the Adobe RGB color space, which is much bigger. But for your purposes it might actually make things worse: most software, including many popular browsers, does not honor the assigned color space and displays all images as sRGB. Basically, the picture will probably look bad.
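To see why, here's a simplified sketch of the mismatch, working in linear light (real files store gamma-encoded values; the matrices are the standard D65 ones, and the example color is arbitrary). Software that ignores the profile reuses the Adobe RGB numbers directly as sRGB, which shifts the color:

```python
import numpy as np

ADOBE_RGB_TO_XYZ = np.array([  # linear Adobe RGB (1998) -> CIE XYZ, D65
    [0.5767, 0.1856, 0.1882],
    [0.2973, 0.6274, 0.0753],
    [0.0270, 0.0707, 0.9911],
])
XYZ_TO_SRGB = np.array([       # CIE XYZ -> linear sRGB, D65
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

adobe_linear = np.array([0.3, 0.7, 0.3])  # arbitrary saturated green

# (a) Color-managed path: convert Adobe RGB through XYZ into sRGB.
correct_srgb = np.clip(XYZ_TO_SRGB @ ADOBE_RGB_TO_XYZ @ adobe_linear, 0, 1)
# (b) Profile-ignoring path: treat the Adobe RGB numbers as if sRGB.
naive_srgb = adobe_linear

print("managed:", np.round(correct_srgb, 3))  # approx [0.141, 0.700, 0.283]
print("naive:  ", np.round(naive_srgb, 3))    # [0.3, 0.7, 0.3]
# The naive rendering keeps too much red, so the green shows up duller
# and less saturated than the color the camera actually recorded.
```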
I think I tried to say too many things in a few paragraphs (while typing on an iPhone!), but you can Google some of these terms to get a better picture (ha ha), or even better: ask me, I'd love to be of help.