Monday, August 07, 2017

The revocation function was unable to check revocation because the revocation server was offline. 0x80092013

I know I haven’t blogged in a while, but I just spent all day on one of the oddest issues I’ve seen when deploying a Microsoft Active Directory Certificate Services enterprise subordinate certificate authority.  I had to renew the subordinate certificate authority’s certificate, and upon doing so, the service would fail to start because it claimed that the revocation server was offline.

That takes me to why I had to renew the certificates in the first place.  The signing certificates were issued while I was on vacation, before the CRL distribution point, online certificate status protocol, and authority information access configurations were done on the intermediate CA (it is a three-tiered model).

Active Directory Certificate Services did not start: Could not load or verify the current CA certificate.  ISSUINGCA01 The revocation function was unable to check revocation because the revocation server was offline. 0x80092013 (-2146885613 CRYPT_E_REVOCATION_OFFLINE).

To make sure that was really the problem, I temporarily disabled revocation checking:

certutil.exe -setreg ca\CRLFlags +CRLF_REVCHECK_IGNORE_OFFLINE
That worked – but for reasons that should be obvious if you are reading this, I can’t leave it that way.

All of the certutil.exe -verify -urlfetch commands worked.  certutil.exe -url worked too, meaning everything verified.  But I still could not get the service to start with revocation checking enabled; it would always fail.  Rebooting, several times, didn’t seem to help either.

I moved on to certutil.exe -urlcache, specifically with the * delete option.  That is something I’ve used in the past to solve some pesky issues with revocation checking, but my first attempts didn’t resolve the problem either.  I took a break, grabbed some lunch, and realized that the certutil.exe -urlcache store is per-user, so no amount of * delete run in my user context would help a service running as the local system.  So, out comes PsExec.  Using psexec.exe -i -s cmd.exe to bring up a command prompt running as the local system, I saw a whole new list of entries from certutil.exe -urlcache.  I was on to something, so out came certutil.exe -urlcache * delete, and I was able to start the certificate authority service.

It takes a lot to really baffle me with Windows these days, but this one had me confused for hours.  Windows PKI with Active Directory Certificate Services really isn’t that complex, and I’ve deployed several, but when mistakes were made and needed to be cleaned up, I was in unfamiliar territory.

Sunday, July 14, 2013

Thoughts on Congressional Term Limits

Continuing from my Article V post yesterday: should there be term limits for representatives and senators?  While it sounds like a good idea on its surface, I also consider why the founding fathers did not write term limits into the Constitution of the United States.

The men who wrote our constitution were familiar with term limits; many had studied the ancient democracies of the world, many of which included some sort of office rotation, either by law or by tradition.  Some of our colonial assemblies also employed term limits.  Thomas Jefferson proposed term limits "to prevent every danger which might arise to American freedom by continuing too long in office the members of the Continental Congress".  And George Mason said (or wrote), "nothing is so essential to the preservation of a Republican government as a periodic rotation".  It wasn't just the Virginians who embraced term limits; Benjamin Franklin of Pennsylvania also embraced term limits and executive rotation.  Not all of our founders are so well documented, but they had just overthrown a perpetual monarch and desired to set up a form of limited government; it would not be unreasonable to think that many considered term limits a good idea.  The Articles of Confederation included term limits for the president of the congress: "no person be allowed to serve in the office of president more than one year in any term of three years".

So, why were term limits left out?  None of our founders left a note explaining this, so we can only speculate.  One reason was tradition: elected colonial assemblymen generally didn't serve many consecutive terms; they had farms and families to tend to, and it just wasn't practical to be a perpetual politician.  Article One, Section Two, Paragraph One of the Constitution of the United States is often cited as the reason why term limits were not needed.  It seems logical that the two-year turnover of the House through direct elections was, at least at the time, an adequate solution to ensure periodic rotation.
The House of Representatives shall be composed of Members chosen every second Year by the People of the several States, and the Electors in each State shall have the Qualifications requisite for Electors of the most numerous Branch of the State Legislature.
I tend to agree with what I think of as the decision of the delegates and states who signed and then ratified the Constitution of the United States.  Every two years, the vast majority of the United States Congress can be replaced.  The ballot box is the most democratic method of term limits.  So why doesn't it happen?  The short answer is, it does, at least every once in a while.  The mid-term elections in 2010 saw a large portion of the Congress replaced, as did the elections a few years before in 2006.  Polls show that 75% of Americans support term limits in general, but when it comes to those who represent them, they still generally prefer the incumbents.  I suppose it is easy to support term limits for the other guy.

So, should there be congressional term limits?  I'm not really sure.  I suppose it would depend on what the limits are; the people are entitled to effective representation, which could be curtailed by limits or durations that are too short, while very lengthy limits could have their own pitfalls.  Perhaps I will consider the ideal term limits in the future.

Saturday, July 13, 2013

How Could Congress be Term Limited?

This is not an argument for or against term limits, simply an explanation of how it could be done.

Many people complain that representatives and senators should be term limited, only being allowed to serve a set number of terms, sort of like the president.  The only way for Congress to be term limited is through one or more constitutional amendments.  The method by which all other amendments have been proposed requires two thirds of both houses of Congress to deem it necessary, and what are the chances that they'll deem limits on their own terms necessary?

Article V, the Constitution of the United States
The Congress, whenever two thirds of both Houses shall deem it necessary, shall propose Amendments to this Constitution, or, on the Application of the Legislatures of two thirds of the several States, shall call a Convention for proposing Amendments, which, in either Case, shall be valid to all Intents and Purposes, as Part of this Constitution, when ratified by the Legislatures of three fourths of the several States, or by Conventions in three fourths thereof, as the one or the other Mode of Ratification may be proposed by the Congress; Provided that no Amendment which may be made prior to the Year One thousand eight hundred and eight shall in any Manner affect the first and fourth Clauses in the Ninth Section of the first Article; and that no State, without its Consent, shall be deprived of its equal Suffrage in the Senate.
Remember, the Constitution of the United States was written to give the states control, and this little-known passage in Article V is, in my opinion, the only way that an amendment setting term limits on Congress could be proposed.  With 50 states, that means the legislatures of 34 states would have to apply for a convention, and 38 states would have to ratify whatever amendment it proposed.

Wednesday, April 24, 2013

E-Mail File Filtering, was that a GZIP file?

One of my customers had a problem: an e-mail was being blocked by their Microsoft Forefront Protection 2010 for Exchange Server file filter, but they couldn't quite figure out why.  The e-mail's attachment did not seem to match any of the restricted file types, yet it was still being blocked.  To make matters worse, their file filter list included several file headers to block, but the log only noted the filter list that blocked the file, not which filter list entry was triggering it.

I first had to break the filter list down into many single-entry filter lists, one per file type, with the following command.

foreach ($a in (Get-FseFilterList -File -List "BlockFiles").FileType){New-FseFilterList -File -List "Block-$a" -Item "*" -FileType $a}

If you are familiar with Forefront Protection for Exchange, you know that its PowerShell commands aren't that great, and the above command created a bit of a mess: all of the new filter lists were created disabled, and none of the action and notification settings were at their defaults, which wasn't what I wanted.  I had to click away in the graphical user interface for a bit and then disable the larger filter list.

Once all the filter lists were ready, I resent the e-mail, and sure enough, it was blocked.  But this time I knew that the attachment had GZIP file headers, because that was the filter list the log flagged as quarantining the message.  There was just one problem: there was nothing resembling a GZIP file attached to the message.  The file that was causing the trouble was an image file with a .EMZ extension.

After a quick Bing search, I learned that a .EMZ file is a Microsoft Office image format known as Windows Compressed Enhanced Metafile, which uses GZIP for compression.  It really is a GZIP file; in fact, you can open it with a compression tool to extract the .EMF file, Enhanced Metafile, inside.
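You can verify this yourself: every GZIP stream starts with the two magic bytes 0x1F 0x8B, which is exactly what a header-based file filter keys on, and the payload of an .EMZ is just a GZIP-compressed .EMF.  A minimal Python sketch, using an in-memory stand-in rather than a real .EMZ attachment:

```python
import gzip

GZIP_MAGIC = b"\x1f\x8b"  # every GZIP stream, .EMZ files included, starts with these bytes

def is_gzip(data: bytes) -> bool:
    """Return True if the byte string starts with the GZIP magic number."""
    return data[:2] == GZIP_MAGIC

def extract_emf(emz_bytes: bytes) -> bytes:
    """Decompress an .EMZ payload to recover the .EMF metafile inside."""
    return gzip.decompress(emz_bytes)

# Stand-in for an .EMZ attachment (real files would be read from disk):
emf_payload = b"EMF data would be here"
emz_blob = gzip.compress(emf_payload)

print(is_gzip(emz_blob))                      # True - why the GZIP header filter fired
print(extract_emf(emz_blob) == emf_payload)   # True - the .EMF comes back out intact
```

This is the same check the file filter performs: it never looks at the .EMZ extension, only at those first two bytes.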

Mystery solved.  Since the customer wanted to allow .EMZ files, the filter list entry for the GZIP header was removed from the main filter list, but since they still wanted to block GZIP files, a new filter list was created to block GZIP files under their common file names.

Friday, September 07, 2012

Put your Exchange lab on the IPV6 Internet

I've been working on a project with a United States Federal Government customer that requires their e-mail system to operate on the public IPv6 Internet.  I've spent a lot of time waiting for the customer's network to be configured for IPv6, but now that the deadline is near, I realized that putting an Exchange system on the IPv6 Internet is one of the few things I've never done before.  I didn't want to do it for the first time on a customer system, and I wanted a way to reliably test the customer's configuration.  Putting my Microsoft Exchange 2010 lab onto the IPv6 public Internet would solve both problems.

IPv6 Tunnel

My lab is on a small business connection from the local cable company.  We have a small range of fixed IPv4 addresses, but no IPv6 support.  I solved this problem by obtaining a free IPv6 tunnel from Hurricane Electric Internet Services.  They make it easy to configure, and even give you the netsh commands to configure your server to use the tunnel.

netsh interface teredo set state disabled
netsh interface ipv6 add v6v4tunnel IP6Tunnel
netsh interface ipv6 add address IP6Tunnel 2001:470:7:305::2
netsh interface ipv6 add route ::/0 IP6Tunnel 2001:470:7:305::1

Since my server is NATted, I had to use the internal IPv4 address in the netsh command.

Exchange 2010

I would like to say that there was a lot of configuration involved, but it just worked.  I had learned earlier that Exchange 2010 automatically looks for AAAA records when doing DNS delivery.  This lab environment is a pretty basic install: a single multi-role Exchange 2010 server on Windows Server 2008 R2.  Of course, to receive e-mail, you will need an MX record.  The tunnel provider assigns an AAAA record to your tunnel's IPv6 address, so I simply changed the MX record to point to that FQDN:

MX preference = 10, mail exchanger =
Address: 2001:470:7:305::2

Of course, you could configure your own AAAA record if you'd like (I did this too).
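Since Exchange 2010 automatically looks for AAAA records when delivering, it can be handy to confirm that a host actually publishes an IPv6 address before pointing an MX record at it.  A minimal Python sketch of that lookup (the host you pass in would be your mail exchanger's FQDN; the IPv6 loopback literal below is just a self-contained demonstration):

```python
import socket

def ipv6_addresses(host: str, port: int = 25):
    """Return the sorted IPv6 addresses a host resolves to (empty list if none)."""
    try:
        # AF_INET6 restricts the lookup to AAAA/IPv6 results only.
        infos = socket.getaddrinfo(host, port, socket.AF_INET6, socket.SOCK_STREAM)
    except socket.gaierror:
        return []
    return sorted({info[4][0] for info in infos})

# An IPv6 literal resolves to itself, so this works even without DNS:
print(ipv6_addresses("::1"))
```

An empty list back from your mail exchanger's name would mean IPv4-only delivery, no matter what the MX record says.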

I did some additional experimenting with mixing IPv4 and IPv6 in the MX record. In doing so, I also created a separate receive connector for IPv6 to make it easy to figure out what was what.

Testing

Testing my lab was also pretty easy. First, Gmail supports IPv6, so when I sent my first e-mail to Gmail, the headers were all IPv6! Additionally, there are some test reflectors out there that are only supposed to receive e-mail on IPv6 and will send a response with the headers; both use IPv6 for the reply by default, so looking at the headers of their replies will let you know if your IPv6 is working. In addition, I configured protocol logging on my send and receive connectors to verbose, enabling me to verify in the logs that IPv6 was being used for message transfer (you can also see this in the message tracking log).


Putting my lab on the IPv6 public Internet was easier than I thought. More and more customers are considering IPv6 functionality, and many US government agencies are under a mandate to support IPv6 on their public-facing systems. Even though it is easy, there is no reason to do it for the first time in production; put your lab on IPv6 and experiment, gaining experience as you go.

Wednesday, September 05, 2012

IPV6 Newbie

Just taking some time to brush up on IPv6 for a work project. (IPv6 Certification Badge for JosephDurnal)

Friday, August 24, 2012

Exchange Pickup Directory Transport Size Limit

I ran into a problem on a well established Microsoft Exchange 2010 installation that has implemented the edge server role and Microsoft Forefront Protection 2010 for Exchange Server.

The administrator was trying to release a message from the Forefront quarantine, but it wasn't going through.  A DSN was sent back to the Forefront e-mail address with the status of 550 5.3.4 PICKUP.MessageSize; message size exceeds a fixed maximum size limit, with the same information under a fail event in the message tracking log.

This system has been up and running for quite some time with a 25 megabyte limit, regularly passing messages of that size, but this message was only a little over 10 megabytes.  It seemed a little strange; I had configured it during implementation and everything seemed in order, but obviously something wasn't quite right.

The answer ended up being in the Set-TransportConfig cmdlet and its MaxSendSize argument.  While that was set way back when on the internal Exchange organization, it wasn't done on the edge transport servers.  It didn't seem to be a problem while the server was doing its usual thing, transferring e-mail to and from the Internet.

To resolve the problem on a Microsoft Exchange 2010 Edge Transport server where pickup directory messages exceed the maximum message size, use Set-TransportConfig -MaxSendSize 25MB (or whatever size you need).

Saturday, July 28, 2012

New pictures of the kids

I updated the header of my blog with new pictures of the children :) finally.  The "new" pictures are about a year old now but that's OK, they are a lot more up to date than the pictures from 2008!

Here are the old ones

Quite the difference!

Monday, June 18, 2012

Tracking e-mails to invalid addresses

Customer request: Can you let me know what was e-mailed to an address that doesn't exist on our system? 

Consultant answer: Maybe, but any information on these e-mails will be limited to the source IP and from address. 

Most e-mail systems don't save much when an e-mail is received for a user that doesn't exist; since the recipient e-mail address is sent early in the SMTP conversation, most e-mails are rejected as soon as the server receives the invalid address.  On Microsoft Exchange Server 2010, the SMTP conversation is logged through verbose protocol logging on the connectors; if this is enabled, you can get a little bit of information on e-mails sent to invalid addresses.

I whipped up the following command for doing just that.

for /f %a in ('dir /b') do for /f "delims=, tokens=3" %b in ('type %a ^| find /i ""') do for /f "delims=, tokens=1,8" %c in ('type %a ^| find "%b" ^| find "MAIL FROM"') do echo %c - %d >> unknownuser.log
I'll break it down

for /f %a in ('dir /b')
This one should be easy, dir /b, a basic directory listing and store it in a variable, you'll see %a in the next line.  I ran the script from c:\Program Files\Microsoft\Exchange Server\V14\TransportRoles\Logs\ProtocolLog\SmtpReceive\

for /f "delims=, tokens=3" %b in ('type %a ^| find /i ""')
This one is a little more complex: using the %a from above, I type each file, piping to find to look for the e-mail address in question.  I only want the connection's session identifier from the comma-delimited file, hence tokens=3, which I store in another variable.

for /f "delims=, tokens=1,8" %c in ('type %a ^| find "%b" ^| find "MAIL FROM"')
Using the file names in %a from above, plus the session ID I've stored in %b, I type the files again, limiting the output to only the session ID I'm looking for, filtering it further to the MAIL FROM entry.  This will get me the e-mail address that tried to send the e-mail.  This time I want two different fields, the date-time and the data, which gives me some useful data in my variables %c and %d.  Notice that I don't define %d; the for command does that for me.

echo %c - %d >> unknownuser.log
Finally, take the %c and %d from above and write them to a file.

I actually started doing this in PowerShell, but it was taking longer than I thought it should.  Since I've done a lot of log parsing with the for command, I decided to go with what I know and use the old DOS commands.  My colleague is working on a PowerShell version; if he posts it to his blog, I'll add a link.
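For anyone who prefers something more readable than nested for commands, here is a rough Python sketch of the same idea, assuming Exchange 2010's comma-separated SmtpReceive protocol log layout (session ID in the third field, date-time in the first, data in the eighth, matching the tokens above).  The sample log lines and addresses are made up for illustration:

```python
import csv
import io

def mail_from_for_recipient(log_text: str, target: str):
    """Find MAIL FROM lines for any SMTP session that mentions a target address.

    Mirrors the nested 'for /f' pipeline: first collect the session IDs
    (third comma-separated field) of lines mentioning the target, then pull
    the date-time (first field) and data (eighth field) from those sessions'
    MAIL FROM lines.
    """
    rows = list(csv.reader(io.StringIO(log_text)))
    sessions = {row[2] for row in rows
                if len(row) > 2 and target.lower() in ",".join(row).lower()}
    return [(row[0], row[7]) for row in rows
            if len(row) > 7 and row[2] in sessions and "MAIL FROM" in row[7]]

# Made-up two-line sample in the same shape as an SmtpReceive log:
sample = (
    "2012-06-18T12:00:01,srv\\Default,08CF1,5,,,>,MAIL FROM:<sender@example.com>,\n"
    "2012-06-18T12:00:02,srv\\Default,08CF1,6,,,<,RCPT TO:<nouser@contoso.com>,\n"
)
print(mail_from_for_recipient(sample, "nouser@contoso.com"))
# [('2012-06-18T12:00:01', 'MAIL FROM:<sender@example.com>')]
```

Parsing the fields with a real CSV reader instead of find also avoids false matches on commas inside the logged data.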

Wednesday, April 11, 2012

The MCSE Returns

Just about everyone in the technology field knows about the MCSE, or Microsoft Certified Systems Engineer, once one of the most respected certifications in the industry before it lost its luster to brain dumps and rampant cheating. I earned my first MCSE back in the 90's, before they made it a little easier, but near the end of its era of respectability. The certificate is still displayed on my office wall.

The MCSE's decline was recognized, and it was replaced with a new certification system: the MCTS, Microsoft Certified Technology Specialist, and the MCITP, Microsoft Certified Information Technology Professional. These certifications are an improvement on the old MCSE/MCP certs. I hold several MCTS and three MCITP certifications.

Enter the new MCSE, the Microsoft Certified Solutions Expert, part of the new Certification Version 3 (CertV3) effort. This new certification system will be based on a core set of technologies with expert specializations in areas like Exchange, Lync, and SharePoint. The new MCSE exams are supposed to measure real 300-level knowledge and should be more challenging, which will hopefully yield better qualified "experts". Only time will tell if cheating and brain dumps will dilute the value of this "expert" certification, or if the term MCSE, with its damaged reputation, will still be considered cheap.

One thing is for sure, the MCM, Microsoft Certified Master, will still be the certification that differentiates the experts from the masters.

Monday, August 08, 2011

Listed as a Microsoft Certified Master, Finally!

After making a few mistakes in getting my information to the right person to update Microsoft's corporate web site, I finally have my name and company listed on the "Meet the MCM's & MCA's" web site.

It was only mildly embarrassing to introduce myself as a Microsoft Certified Master, only to be questioned about why my name wasn't on the list. I didn't have a good excuse like "company policy prohibits it" or anything like that. The good news is that I have the nice plaque and certificate to back up my statement, but now, I can be verified on the web!

Friday, July 29, 2011

Day 5 @ camp Ross

It is the last day, well, sort of, we will be here one more night and leaving first thing in the morning. The boys did lashings at scoutcraft this morning which Alex seemed to like. Alex enjoyed making a candle this morning at handicraft, I'm wondering if it will melt in this heat. It has been hot this week, but today is the hottest at about 97 degrees. They worked on aquanaut and the boys that passed their swim test as swimmers earned their activity badge. Now, it is three hours of open program and once again I have no idea what Alex is up to as he is out with a buddy visiting various program areas. Looking forward to tonight's campfire!

Thursday, July 28, 2011

Have we been at Goshen 5 days already?

Starting out with last night's Catholic mass: it was awesome; the pastor took the time to explain setting up an altar and making the space we were in like a church. A good homily about growing boys at camp. Alex missed it because he wanted to go with his buddies to the interfaith service.

This morning they worked on readyman, more on their service project, and visited camp Bowman to see an actual boy scout camp. Alex also finished his pack 195 sign for the dining hall. Now they are at the Jolly Rock swimming hole which Alex is enjoying immensely. Later today they are going to nature for geologist and they are planning to shoot some more bbs if they make it back from Jolly Rock in time. After dinner there will be some pioneer themed games.

Wednesday, July 27, 2011

Day 4 at camp Ross

Yeah, I have two day 2 posts. Today was the big hike to viewing rock which we did after boating in the morning where he paddled a canoe and a funoe. I ended up rowing another scout around taking pictures. This afternoon he did swimming, bbs, and archery. He spent the whole time in the swimmers area and was able to improve his shooting. Right now he's out at open program with a buddy and I have no idea what he is up to.