Wednesday, March 20, 2019

Windows Time Service Configuration for Amateur (ham) Radio

Microsoft has had a built-in time service since at least Windows 2000, but it isn't exactly easy to use.  Amateur radio operators, for the most part, have not needed to-the-second accuracy until recently, with the advent of the WSJT and WSJT-X modes like JT9 and FT8, which are exploding in popularity.  I've seen so many posts on forums and web sites about what software to use to set accurate time on your Windows computer.  Some advice is just plain wrong, while other posts suggest third-party software that I'm sure works fine but just isn't necessary.

Here is what worked for me:

Run a command prompt as administrator

First, set the service to start automatically (note that sc requires the space after start=):

sc config w32time start= auto

Configure the time service to use a manually specified set of NTP servers:

w32tm /config /manualpeerlist:",0x8,0x8" /syncfromflags:manual /reliable:yes /update

By default, a Windows client will slowly bring the clock into compliance, but I've found this to be too slow.

reg add HKLM\System\CurrentControlSet\Services\w32time\config /v UpdateInterval /t REG_DWORD /d 64

Your best source for understanding the Windows Time Service tools and settings is here:

Please note: if you are a Windows system administrator, this is not the way to do it on your Active Directory domain clients!  You'd do this only on the PDC emulator of the root domain in your forest!
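Under the hood, w32tm is just speaking NTP, and the wire format is simple: the server's transmit timestamp is a 64-bit value at bytes 40-47 of the packet, counting seconds since January 1, 1900 (per RFC 5905).  A short Python sketch (illustrative only, not part of the Windows tooling) showing the epoch conversion, using a locally constructed packet instead of a live server:

```python
import struct
from datetime import datetime, timezone

NTP_EPOCH_OFFSET = 2208988800  # seconds between 1900-01-01 (NTP epoch) and 1970-01-01 (Unix epoch)

def parse_transmit_time(packet: bytes) -> datetime:
    """Extract the 64-bit transmit timestamp (bytes 40-47) from an NTP response."""
    seconds, fraction = struct.unpack("!II", packet[40:48])
    unix_time = seconds - NTP_EPOCH_OFFSET + fraction / 2**32
    return datetime.fromtimestamp(unix_time, tz=timezone.utc)

# Build a fake 48-byte response carrying a known timestamp to show the conversion
known = datetime(2019, 3, 20, 12, 0, 0, tzinfo=timezone.utc)
ntp_seconds = int(known.timestamp()) + NTP_EPOCH_OFFSET
packet = bytearray(48)
packet[40:48] = struct.pack("!II", ntp_seconds, 0)

print(parse_transmit_time(bytes(packet)))  # 2019-03-20 12:00:00+00:00
```

FT8 only tolerates an offset of a second or two, which is why the slow default slewing isn't good enough.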

Monday, January 28, 2019

Elecraft K1 Progress

I was a much more active blogger the last time I worked on the Elecraft K1.  A lot of progress has been made.  I've completed the soldering!  I'm just waiting for a time to power it up and test.  I want to have several hours dedicated just in case something else is wrong and I have to figure it out and fix it.

Sunday, January 28, 2018

Whistler TRX-1 Digital Scanner Radio Handheld Review


The Whistler TRX-1 Digital Scanner Radio Handheld is one of the latest on the market and one of the few, if not the only, scanners available that does DMR (Digital Mobile Radio) and NXDN (Next Generation Digital Narrowband) modes, as well as Project 25 Phase 1 & 2.  Overall, I am pleased with this scanner and would buy it again knowing everything that I have learned after using it for a couple of weeks.  It has been a long time since I have owned a scanner, the last one being bought before digital transmissions were common, so I don't really have much to compare it to.  I would have to give it an 8.5 out of 10.

 While the Capitol Police are encrypted, I had no problem listening to the Metro Transit Police and DC Fire Department as I walked around the core of Washington DC.   By the way, encrypted transmissions give you a telephone-like busy signal and a highlighted E on the display. 

Frequency Coverage

Covering almost all frequencies from 25 to 1300 megahertz, it picks up everything from the Citizens Band to the top of the 23 cm amateur radio band.  It skips cellular and broadcast radio and television, which is pretty standard for scanners in its price range.  I've yet to try it down in the HF band, or beyond the 800 MHz public safety bands, but from about 46 to 870 MHz it has exceeded my expectations, both in receiving neighboring county systems and in its performance in the dense RF environment of downtown DC.  One coverage area more common today than when I last purchased a scanner is the military bands around 300 MHz.  Both aircraft and trunked base systems can be found there in the 216-420 MHz range, which I had never scanned before. 

Analog and Digital Modes

This radio receives almost every mode you'll find in use by government and business, as well as by many hams: AM, FM, NFM, FM-MOT (Motorola), LTR (EF Johnson), CTCSS, DCS, NAC on P25, EDACS wide/narrow (GE/Ericsson/Harris), P25 Phase I, X2-TDMA, P25 Phase II, DMR, MotoTRBO Tier II, and NXDN.  DMR is a must-have these days, as many businesses have been moving to this technology.  NXDN still seems a little rare, but it is out there and growing.  If you are interested in railroad communications, NXDN is a must-have for you: while the rollout is slow, NXDN is the future standard on the railroad.

Other Features

I really like that, although it is a Mini-B connection, it can be powered and charged through USB.  The NiMH batteries do take a while (overnight) to fully charge through the radio.  It can also be powered by alkaline batteries, though it can drain them rather quickly, and swapping out batteries is a bit of an ordeal as you have to remove the belt clip, antenna, and the rubberized case.  It has a 3.5 mm jack for audio out, which I have used to connect to the AUX port in my car.  This 3.5 mm jack can be converted to a discriminator tap for use with software decoders. 

Programming - Basic

The TRX-1 comes loaded with a fairly up-to-date database that allows basic programming to be done with just a few button presses to select your location by county or ZIP code.  This easy programming has its advantages: it is quick and easy and requires little knowledge of the frequencies and digital systems in the area.  The disadvantage is that when you use this method of programming, you are pretty much stuck with what they give you.  It works, but in large suburban counties or cities the scan list gets pretty long, and it can take a while to scan all the way through back to the beginning.

Programming – Intermediate

Using the software, it is easier to customize the scan lists with conventional frequencies and trunked system talk groups.  This is where you can set up the scanner for the way you want to use it.  I didn't find it completely intuitive, but it wasn't too hard to figure out, so, as software goes, it is about average.  I was able to set up my scan lists by county which suits my long commutes and frequent trips from Western Maryland to the Baltimore Area.

This is a screen grab of one of my scan lists that includes three trunked systems and a conventional frequency for Anne Arundel County, Maryland.  It may be hard to see, but I omitted the AAPD 11D Southern District police talk group, as I don't really go to the south of the county; the same reason applies for eliminating the AAFD 1C talk group.

I'm sure that there will be endless tweaking of my scan lists to try to find that sweet spot where I get to listen to what I find interesting without wasting time scanning or listening to the more routine stuff.  I think my next project will be to create two scan lists for my home county: one for dispatch and major incidents, and the other for other types of traffic like routine response channels, highways, mall security, etc. 

Programming – Advanced

I haven't ventured into this very much, but this scanner has the capability to program multiple virtual scanners that can be loaded from the supplied SD card.  When you load a different virtual scanner, it essentially wipes everything in the memory and replaces it with something totally different, from settings to conventional frequencies to trunked systems.  I think this will be useful for experimenting with different settings and systems, and for folks who travel.  I don't think it is for everyone, but it seems like a nice feature to have.

The Manual

The supplied manual was helpful, but it just didn't seem that great.  My understanding of the radio really improved when I found a manual online:  the Easier to Read Whistler WS1080/1088 Handheld Digital Scanner and Programming Software Manual.  It was written for an older model, but it really helped me figure this thing out.

More Resources

There is really only one place where I've found a large enough community to ask questions where people will know the answer: the RadioReference forum on Whistler scanners.

A more comprehensive write up about the TRX-1 can be found here:

Friday, January 26, 2018

Two Way Forest Trust from One Side Access Denied

Starting with my first in 1996, I've created countless Windows domain and forest trust relationships over the years.  The only thing that has ever caused a problem for me when creating trust relationships is network connectivity issues: firewalls, routing, etc.  This week, what was planned to be a routine forest trust didn't work.

In this scenario, I was to create a trust between a forest where I am an Enterprise Administrator and a remote forest where I have no access, but whose administrators had already created their side of the trust and shared the trust password with me.  I used Active Directory Domains and Trusts with the same procedure I've always used, had just used in my lab, and had verified against Microsoft's guidance from TechNet: Create a Two-Way, Forest Trust for One Side of the Trust.  This time, I was prompted for a username and password at a point where I'd never been asked for one before: after entering the name of the remote forest (step 4 in the linked procedure) and before being prompted to select the trust type (step 5 in the linked procedure).  This was a problem, since I wasn't going to get an account or access in the remote forest.

Thinking that it could have just been some sort of GUI anomaly and that it may work in the command line or at least give me some more diagnostic information, I switched to PowerShell.

$targetforest = ""
$trustpw = "password"
$Trustdir = "Bidirectional"
$localforest = [System.DirectoryServices.ActiveDirectory.Forest]::getCurrentForest()
$localforest.CreateLocalSideOfTrustRelationship($targetforest,$Trustdir,$trustpw)

Exception calling "CreateLocalSideOfTrustRelationship" with "3" argument(s): "Access is denied."
At line:1 char:1
+ $localforest.CreateLocalSideOfTrustRelationship($targetforest,$Trustdir ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : UnauthorizedAccessException 

It still didn't work; essentially, I got the same error message.  After scouring the event logs on the local forest domain controller, I found nothing to indicate what I was being denied access to.  I was still hesitant to blame the remote forest with what I had so far, but I started suspecting it.  Without access, it was hard to know for sure.

I used a combination of the Microsoft Message Analyzer and Process Monitor to get a better idea of what was going on.

Process Monitor was really all I needed; here's the event:

High Resolution Date & Time: 1/26/2018 12:25:00.8212215 PM
Event Class: File System
Operation: CreateFile
Path: \\\PIPE\lsarpc
TID: 2152
Duration: 0.0005023
Desired Access: Generic Read/Write
Disposition: Open
Options: Non-Directory File
Attributes: n/a
ShareMode: Read, Write
AllocationSize: n/a

Message Analyzer also had an access denied entry that was helpful

Negotiate, Status: Success, ClientGuid: {xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx}, DialectRevision: SMB 3.0.2
SessionSetup, Status: Success, NTLM v1 with extended session security, Flags: 0
TreeConnect, Status: Success, Path: \\\IPC$, TreeID: 0x00000001, Capabilities:
Create, Status: STATUS_ACCESS_DENIED, FileName: lsarpc

These tools were more of a verification of what was happening, with a little more detail to help find out why.  Process Monitor was telling me that the anonymous logon couldn't connect to lsarpc, even though anonymous access to that pipe is pretty standard.  Could it be that the remote forest EA had removed lsarpc from the anonymous access pipes?  I checked some standard security templates, which still had the lsarpc pipe accessible by the Anonymous Logon, but I thought it could still be a possibility, so I tested it in my lab.

I removed lsarpc from the NullSessionPipes value under HKLM\System\CurrentControlSet\Services\LanmanServer\Parameters on the domain controller representing the remote forest and rebooted.  This can also be done with Group Policy by removing lsarpc from the list in GPO_name\Computer Configuration\Windows Settings\Security Settings\Local Policies\Security Options\Network access: Named Pipes that can be accessed anonymously.  For the first time, I was able to reproduce the problem in my lab environment.

I tested two solutions to the problem: the first was returning the lsarpc entry to the NullSessionPipes value, and the second was to use an enterprise admin account in the remote forest.  Both allowed the trust to be created successfully.

This was kind of a strange issue, and I wasn't able to find the answer with some in-depth internet searching.  A lot of folks have trouble creating trusts in some way, but usually it is an easy problem like DNS, a firewall, or not using an admin account.

Monday, August 07, 2017

The revocation function was unable to check revocation because the revocation server was offline. 0x80092013

I know I haven't blogged in a while, but I just spent all day on the oddest of issues while deploying a Microsoft Active Directory Certificate Services enterprise subordinate certificate authority.  I had to renew the subordinate certificate authority's certificate, and upon doing so, the service would fail to start because it claimed that the revocation server was offline.

That takes me to why I had to renew the certificate in the first place.  The signing certificates were issued while I was on vacation, before the CRL distribution point, Online Certificate Status Protocol, and authority information access configurations were done on the intermediate CA (it is a three-tiered model).

Active Directory Certificate Services did not start: Could not load or verify the current CA certificate.  ISSUINGCA01 The revocation function was unable to check revocation because the revocation server was offline. 0x80092013 (-2146885613 CRYPT_E_REVOCATION_OFFLINE).

To make sure that was really the problem, I temporarily disabled revocation checking:

Certutil.exe -setreg ca\CRLFlags +CRLF_REVCHECK_IGNORE_OFFLINE

That worked, but for reasons that should be obvious if you are reading this, I can't leave it that way.

All of the certutil.exe -verify -urlfetch commands worked.  Certutil.exe -url worked too, meaning everything verified.  But I still could not get the service to start with revocation checking enabled; it would always fail.  Rebooting, several times, didn't seem to help either.

I moved on to certutil.exe -urlcache, specifically with the * delete option.  That is something I've used in the past for solving some pesky issues with revocation checking, but my first attempts didn't seem to resolve the problem either.  I took a break, grabbed some lunch, and thought to myself that the certutil.exe -urlcache cache is per-user, so no amount of * delete run in the user context would help the service.  So, out comes PsExec.  Using psexec.exe -i -s cmd.exe to bring up a command prompt running as the local system, I saw a whole new list of entries with certutil.exe -urlcache.  I was on to something, so out came certutil.exe -urlcache * delete, and I was able to start the certificate authority service.

It takes a lot to really baffle me with Windows these days, but this one had me confused for hours.  Windows PKI with Active Directory Certificate Services really isn't that complex, and I've deployed several, but when mistakes were made and needed to be cleaned up, I was in unfamiliar territory.

Sunday, July 14, 2013

Thoughts on Congressional Term Limits

A continuation from my Article Five post yesterday: should there be term limits for representatives and senators?  While it sounds like a good idea on its surface, I also consider why the founding fathers did not write term limits into the Constitution of the United States.

The men who wrote our constitution were familiar with term limits; many had studied the ancient democracies of the world, many of which included some sort of office rotation, either by law or by tradition.  Some of our colonial assemblies also employed term limits.  Thomas Jefferson proposed term limits "to prevent every danger which might arise to American freedom by continuing too long in office the members of the Continental Congress".  And George Mason said (or wrote), "nothing is so essential to the preservation of a Republican government as a periodic rotation".  It wasn't just the Virginians who embraced term limits; Benjamin Franklin of Pennsylvania also embraced term limits and executive rotation.  Not all of our founders are so well documented, but they had just overthrown a perpetual monarch and desired to set up a form of limited government, so it would not be unreasonable to think that many would consider term limits a good idea.  The Articles of Confederation included term limits for the president of the congress: "no person be allowed to serve in the office of president more than one year in any term of three years".

So, why were term limits left out?  None of our founders left a note explaining this, so we can only speculate.  One reason was tradition: elected colonial assemblymen generally didn't serve many consecutive terms; they had farms and families to tend to, and it just wasn't practical to be a perpetual politician.  Paragraph one, Section two, Article one of the Constitution of the United States is often cited as the reason why term limits were not needed.  It seems logical that the two-year turnover of the House through direct elections was, at least at the time, an adequate solution to ensure periodic rotation.
The House of Representatives shall be composed of Members chosen every second Year by the People of the several States, and the Electors in each State shall have the Qualifications requisite for Electors of the most numerous Branch of the State Legislature.
I tend to agree with what I think of as the decision of the delegates and states who signed and then ratified the Constitution of the United States.  Every two years, the vast majority of the United States Congress can be replaced.  The ballot box is the most democratic method of term limits.  So why doesn't it happen?  The short answer is, it does, at least every once in a while.  The mid-term elections in 2010 saw a large portion of the congress replaced, as did those a few years before in 2006.  Polls show that 75% of Americans support term limits in general, but when it comes to those who represent them, they still generally prefer the incumbents.  I suppose it is easy to support term limits for the other guy.

So, should there be congressional term limits?  I'm not really sure; I suppose it would depend on what the limits are.  The people are entitled to effective representation, which could be curtailed by limits or durations that are too short, but very lengthy limits could have their own pitfalls.  Perhaps I will consider the ideal term limits in the future.

Saturday, July 13, 2013

How Could Congress be Term Limited?

This is not an argument for or against term limits, simply an explanation of how it could be done.

Many people often complain that representatives and senators should be term limited, allowed to serve only a set number of terms, sort of like the president.  The only way for congress to be term limited is through one or more constitutional amendments.  The method by which all other amendments have been proposed requires two thirds of congress to deem it necessary, and what are the chances that they'll deem limits on their own terms necessary?

Article V, the Constitution of the United States
The Congress, whenever two thirds of both Houses shall deem it necessary, shall propose Amendments to this Constitution, or, on the Application of the Legislatures of two thirds of the several States, shall call a Convention for proposing Amendments, which, in either Case, shall be valid to all Intents and Purposes, as Part of this Constitution, when ratified by the Legislatures of three fourths of the several States, or by Conventions in three fourths thereof, as the one or the other Mode of Ratification may be proposed by the Congress; Provided that no Amendment which may be made prior to the Year One thousand eight hundred and eight shall in any Manner affect the first and fourth Clauses in the Ninth Section of the first Article; and that no State, without its Consent, shall be deprived of its equal Suffrage in the Senate.
Remember, the Constitution of the United States was written to give the states control, and this little-known passage in Article Five is, in my opinion, the only way that an amendment setting term limits on congress could be proposed.

Wednesday, April 24, 2013

E-Mail File Filtering, was that a GZIP file?

One of my customers had a problem: an e-mail was being blocked by their Microsoft Forefront Protection 2010 for Exchange Server file filter, but they couldn't quite figure out why.  The e-mail's attachment did not seem to match any of the restricted file types, yet it was still being blocked.  To make matters worse, their file filter list included several file headers that would be blocked, but the log only noted the filter list that blocked the file, not which filter list entry triggered it.

I first had to break the filter list down into many single-entry filter lists, one per file type, with the following command.

foreach ($a in (Get-FseFilterList -File -List "BlockFiles").FileType){New-FseFilterList -File -List "Block_$a" -Item "*" -FileType $a}

If you are familiar with Forefront Protection for Exchange, you know that its PowerShell commands aren't that great, and the above command created a bit of a mess: all of the new filter lists were disabled, and none of the action and notification settings were at their defaults, which wasn't what I wanted.  I had to click away in the graphical user interface a bit, then disable the larger filter list.

Once all the filter lists were ready, I resent the e-mail, and sure enough, it was blocked.  This time I knew that the attachment had GZIP file headers, because that is the filter list the log flagged as quarantining the message.  There was just one problem: there was nothing resembling a GZIP file attached to the message.  The file causing the trouble was an image file with a .EMZ extension.

After a quick Bing search, I learned that a .EMZ file is a Microsoft Office image format known as a Windows Compressed Enhanced Metafile, which uses GZIP for compression.  It really is a GZIP file; in fact, you can open it with a compression tool to extract the .EMF file (Enhanced Metafile) inside.
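To see why a header-based filter flags it: every GZIP stream begins with the magic bytes 0x1F 0x8B, and the file filter keys on those bytes rather than the extension.  A short Python sketch (illustrative, not part of Forefront) that checks and unpacks a simulated .EMZ the same way:

```python
import gzip

GZIP_MAGIC = b"\x1f\x8b"  # every GZIP stream (and therefore every .EMZ file) starts with these bytes

def looks_like_gzip(data: bytes) -> bool:
    """Return True if the byte stream has a GZIP header, regardless of its extension."""
    return data[:2] == GZIP_MAGIC

def extract_emf(emz_bytes: bytes) -> bytes:
    """Decompress a .EMZ payload to recover the .EMF metafile inside."""
    return gzip.decompress(emz_bytes)

# Simulate an .EMZ attachment: GZIP-compressed metafile content
fake_emf = b"EMF metafile content"
fake_emz = gzip.compress(fake_emf)

print(looks_like_gzip(fake_emz))          # True - a filter keyed on the GZIP header blocks it
print(looks_like_gzip(fake_emf))          # False
print(extract_emf(fake_emz) == fake_emf)  # True
```

This is exactly why the GZIP filter list entry caught an innocent-looking image attachment.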

Mystery solved.  Since the customer wants to allow .EMZ files, the filter list entry for the GZIP header was removed from the main filter list, but since they still wanted to block GZIP files, a new filter list was created to block GZIP files under their common file names.

Friday, September 07, 2012

Put your Exchange lab on the IPv6 Internet

I've been working on a project with a United States Federal Government customer that requires their e-mail system to operate on the public IPv6 Internet. I've spent a lot of time waiting for the customer's network to be configured for IPv6, but now that the deadline is near, I realized that putting an Exchange system on the IPv6 Internet is one of the few things I've never done before. I didn't want to do it for the first time on a customer system, and I wanted a way to reliably test the customer's configuration. Putting my Microsoft Exchange 2010 lab onto the public IPv6 Internet would solve both problems.

IPv6 Tunnel

My lab is on a small business connection from the local cable company. We have a small range of fixed IPv4 addresses, but no IPv6 support. I solved this problem by obtaining a free IPv6 tunnel from Hurricane Electric Internet Services. They make it easy to configure, and even give you the netsh commands to configure your server to use the tunnel.

netsh interface teredo set state disabled
netsh interface ipv6 add v6v4tunnel IP6Tunnel
netsh interface ipv6 add address IP6Tunnel 2001:470:7:305::2
netsh interface ipv6 add route ::/0 IP6Tunnel 2001:470:7:305::1

Since my server is NATted, I had to use the internal IPv4 address in the netsh command.

Exchange 2010

I would like to say that there was a lot of configuration involved, but it just worked. I had learned earlier that Exchange 2010 automatically looks for AAAA records when doing DNS delivery. This lab environment is a pretty basic install: a single multi-role Exchange 2010 server on Windows Server 2008 R2. Of course, to receive e-mail, you will need to have an MX record. The tunnel provider assigns an AAAA record to your IPv6 address; I simply changed the MX record to point to that FQDN. MX preference = 10, mail exchanger =

Address: 2001:470:7:305::2

Of course, you could configure your own AAAA record if you'd like (I did this too).

I did some additional experimenting with mixing IPv4 & IPv6 in the MX record. In doing so, I also created a separate receive connector for IPv6 to make it easy to figure out what was what.

Testing

Testing my lab was also pretty easy. First, Gmail supports IPv6, so when I sent my first e-mail to Gmail, the headers were all IPv6! Additionally, there are some test reflectors out there ( & ) that are only supposed to receive e-mail on IPv6 and will send a response with the headers; both use IPv6 for the reply by default, so looking at the headers of their replies will let you know if your IPv6 is working. In addition, I configured protocol logging on my send and receive connectors to verbose, enabling me to verify in the logs that IPv6 was being used for message transfer (you can also see this in the message tracking log).


Putting my lab on the public IPv6 Internet was easier than I thought. More and more customers are considering IPv6 functionality, and many US government agencies are under a mandate to support IPv6 on their public-facing systems. Even though it is easy, there is no reason to do it in production for the first time; put your lab on IPv6 and experiment, gaining experience as you go.

Wednesday, September 05, 2012

IPv6 Newbie

Just taking some time to brush up on IPv6 for a work project. IPv6 Certification Badge for JosephDurnal

Friday, August 24, 2012

Exchange Pickup Directory Transport Size Limit

I ran into a problem on a well established Microsoft Exchange 2010 installation that has implemented the edge server role and Microsoft Forefront Protection 2010 for Exchange Server.

The administrator was trying to release a message from the Forefront quarantine, but it wasn't going through.  A DSN was sent back to the Forefront e-mail address with the status of 550 5.3.4 PICKUP.MessageSize; message size exceeds a fixed maximum size limit, with the same information under a fail event in the message tracking log.

This system has been up and running for quite some time with a 25 megabyte limit, regularly passing messages of that size, but this message was only a little over 10 megabytes.  It seemed a little strange; I had configured it during implementation and everything seemed in order, but obviously, something wasn't quite right. 

The answer ended up being in the Set-TransportConfig cmdlet and the MaxSendSize parameter.  While that was set way back when on the internal Exchange organization, it wasn't done on the Edge Transport servers.  It hadn't been a problem when the server was doing its usual thing, transferring e-mail to and from the Internet. 

To resolve the problem on a Microsoft Exchange 2010 Edge Transport server where pickup directory messages exceed the maximum message size, use Set-TransportConfig -MaxSendSize 25MB (or whatever size you need). 

Saturday, July 28, 2012

New pictures of the kids

I updated the header of my blog with new pictures of the children :) finally.  The "new" pictures are about a year old now, but that's OK; they are a lot more up to date than the pictures from 2008!

Here are the old ones

Quite the difference!

Monday, June 18, 2012

Tracking e-mails to invalid addresses

Customer request: Can you let me know what was e-mailed to an address that doesn't exist on our system? 

Consultant answer: Maybe, but any information on these e-mails will be limited to the source IP and from address. 

Most e-mail systems don't save much when an e-mail is received for a user that doesn't exist.  Since the recipient e-mail address is sent early in the SMTP conversation, most such e-mails are dropped as soon as the server receives the invalid address.  On Microsoft Exchange Server 2010, the SMTP conversation is logged through verbose protocol logging on the connectors; if this is enabled, you can get a little bit of information on e-mails sent to invalid addresses.

I whipped up the following command for doing just that.

for /f %a in ('dir /b') do for /f "delims=, tokens=3" %b in ('type %a ^| find /i ""') do for /f "delims=, tokens=1,8" %c in ('type %a ^| find "%b" ^| find "MAIL FROM"') do echo %c - %d >> unknownuser.log

I'll break it down:

for /f %a in ('dir /b')
This one should be easy: dir /b gives a bare directory listing, and each file name is stored in a variable; you'll see %a in the next line.  I ran the script from C:\Program Files\Microsoft\Exchange Server\V14\TransportRoles\Logs\ProtocolLog\SmtpReceive\

for /f "delims=, tokens=3" %b in ('type %a ^| find /i ""')
This one is a little more complex.  Using the %a from above, I type each file, piping to find the e-mail address I'm looking for.  I only want the connection's session identifier from the comma-delimited file, hence tokens=3, which I store in another variable.

for /f "delims=, tokens=1,8" %c in ('type %a ^| find "%b" ^| find "MAIL FROM"')
Using the file names in %a from above, plus the session ID I've stored in %b, I type the files again, limiting the output to the session ID I'm looking for and filtering it further to the MAIL FROM entry.  This will get me the e-mail address that tried to send the e-mail.  This time I want two fields, the date-time and the data, which gives me useful data in variables %c and %d.  Notice that I don't define %d; the for command does that for me.

echo %c - %d >> unknownuser.log
Finally, take the %c and %d from above and write them to a file.

I actually started doing this in PowerShell, but it was taking longer than I thought it should.  Since I've done a lot of log parsing with the for command, I decided to go with what I know and use the old DOS commands.  My colleague is working on a PowerShell version; if he posts it to his blog, I'll give him a link. 
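For comparison, here is the same two-pass idea, find the session IDs that touched the target address, then pull the MAIL FROM entries for those sessions, as a small Python sketch.  The sample log lines are simplified stand-ins, not the exact Exchange protocol log layout:

```python
import csv
from io import StringIO

# Simplified stand-in for an SMTP receive protocol log:
# date-time, connector, session-id, direction, data
SAMPLE_LOG = """\
2012-06-18T10:00:01,Default,08CF1,>,220 mail ready
2012-06-18T10:00:02,Default,08CF1,<,MAIL FROM:<sender@example.com>
2012-06-18T10:00:03,Default,08CF1,<,RCPT TO:<nouser@contoso.com>
2012-06-18T10:00:04,Default,08CF2,<,MAIL FROM:<other@example.net>
2012-06-18T10:00:05,Default,08CF2,<,RCPT TO:<real.user@contoso.com>
"""

def senders_to_address(log_text: str, target: str) -> list[str]:
    """Return 'date-time - MAIL FROM' lines for every session that hit `target`."""
    rows = list(csv.reader(StringIO(log_text)))
    # Pass 1: collect session IDs (third field) where the target address appears
    sessions = {r[2] for r in rows if target.lower() in r[-1].lower()}
    # Pass 2: pull the MAIL FROM entries belonging to those sessions
    return [f"{r[0]} - {r[-1]}" for r in rows
            if r[2] in sessions and "MAIL FROM" in r[-1]]

for line in senders_to_address(SAMPLE_LOG, "nouser@contoso.com"):
    print(line)  # 2012-06-18T10:00:02 - MAIL FROM:<sender@example.com>
```

Same result as the for one-liner: the source address of anything sent to the nonexistent recipient, and nothing else.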

Wednesday, April 11, 2012

The MCSE Returns

Just about everyone in the technology field knows about the MCSE, or Microsoft Certified Systems Engineer, once one of the most respected certifications in the industry, which lost its luster to brain dumps and rampant cheating. I earned my first MCSE back in the '90s, before they made it a little easier, but near the end of its era of respectability. The certificate is still displayed on my office wall.

The MCSE's decline was recognized, and the certification was replaced with a new system: the MCTS, Microsoft Certified Technology Specialist, and the MCITP, Microsoft Certified Information Technology Professional. These certifications are an improvement on the old MCSE/MCP certs. I hold several MCTS and three MCITP certifications.

Enter the new MCSE, the Microsoft Certified Solutions Expert, part of the new certification version 3 (CertV3) effort. This new certification system will be based on a core set of technologies, with expert specializations in areas like Exchange, Lync, and SharePoint. The new MCSE exams are supposed to measure real 300-level knowledge and should be more challenging, which will hopefully yield better-qualified "experts". Only time will tell whether cheating and brain dumps will dilute the value of this "expert" certification, or whether the term MCSE, with its damaged reputation, will still be considered cheap.

One thing is for sure, the MCM, Microsoft Certified Master, will still be the certification that differentiates the experts from the masters.