Sounds like you would need to rebuild the RAID array to correct the broken pairing of the member drives. If you have never done this before, I would advise handing the task to someone who knows what they are doing, because you can make matters worse very quickly!
I have had to go through this before, such as on our Exchange server, and it was a very tense process, being extremely careful the whole way, because one wrong move and you're looking at thousands of lost e-mails.
Here is the one FACT from the site linked below that is a MUST-know for anyone working with RAID-5:
Beyond the seemingly obvious facts, there is ONE fundamental law that should be tattooed on the brain of every computer system operator responsible for RAID-5 data storage:
A RAID-5 rebuild procedure will ALWAYS FAIL unless ALL of the sub-system components, i.e. the controller and all hard disk drive members, are present and accounted for; this means: verifiably working properly.
Therefore, if you cannot now access and/or backup all the data stored on your RAID-5, do not attempt to rebuild. If you can back it up, then get that done first. Failed rebuild attempts are lamentable and make successful data recovery more difficult and potentially impossible.
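To put that rule into practice, here is a rough sketch of the go/no-go check I would script before touching the array. This assumes Linux software RAID managed by mdadm (a hardware controller would have its own vendor tool instead), and `/dev/md0` plus the `all_members_ok` helper are made-up names for illustration:

```shell
# Hedged sketch (assumption: Linux software RAID managed by mdadm).
# Before ANY RAID-5 rebuild, verify every member is present and healthy:
#   mdadm --detail /dev/md0        # array state and member list
#   smartctl -H /dev/sdX           # SMART health of each member disk
# The helper below just scans mdadm-style output for trouble words,
# so the go/no-go logic can be demonstrated without real hardware.

all_members_ok() {
  # Return non-zero if the report mentions degraded, faulty, or removed devices
  ! printf '%s' "$1" | grep -Eiq 'degraded|faulty|removed'
}

healthy_report='State : clean
Active Devices : 3
Working Devices : 3
Failed Devices : 0'

broken_report='State : clean, degraded
Active Devices : 2
Failed Devices : 1'

if all_members_ok "$healthy_report"; then
  echo "all members OK: back up the data, then consider a rebuild"
fi
if ! all_members_ok "$broken_report"; then
  echo "degraded array: do NOT rebuild, image the disks first"
fi
```

The point is the order of operations: health check first, full backup second, rebuild last, which matches the warning above about failed rebuild attempts making recovery harder.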
http://www.recover-raid.com/failed-RAID-help.html
Hopefully your employer is understanding and you're still there, because to some businesses a mistake like this = loss of job, depending on the business and how critical the mistake was.
My worst mistake in my first IT job back in the 90s was accidentally looping a patch cable back into the network and creating a broadcast storm. The closet was a rat's nest of cables, all the same color, nothing labeled, some plugged into the switch and others just dangling from the patch panels, and I got confused between the cable I was fishing through and a loose cable end in the tangle of Cat5. I accidentally plugged a cable that came out of, say, port 21 on a 24-port switch into, say, port 12, saw the status LED light up, and walked away.

Within about 10 minutes I was getting calls from everywhere that the network was down. I went back to where I was working and the status LEDs were blinking rapidly. My biggest problem was that I didn't remember I had plugged into port 12, and the cable wasn't labeled, so I had to start at port 1 and follow each cable through the rat's nest to make sure it went where it belonged. When I finally got to port 12 and traced it back to port 21 of the same 24-port switch, I disconnected the looped-back patch cable. I then had to reboot the switches to end the storm, which brought the network up healthy again. I connected my laptop to a port on the switch, ran a quick network test, and all was well. I called my boss to report that the problem was solved, and he notified the users to get back to work, the network was back up.

This time I was extremely careful to plug in the patch cable I was originally running, making sure it was the correct cable end being fished through the tangled mess left by someone before me who didn't believe in cable management. Then I had to explain to my boss what I did wrong and apologize for bringing down the company network for almost an hour.
Fortunately my boss understood and I didn't lose my job, even though it was only about my 34th day of employment, and the first 90 days are the most critical for proving yourself. I was then given a late-night job, while everyone else was sleeping, to clean up the wiring mess, route the cables properly, and label them with a labeler.