Care to Comment

You are an obvious obfuscator who never adds to the conversation.

What I said about deceleration and amplification here is correct, yet you insist on making feigned posts about its accuracy.

Take a hike.
Incorrect on all points again, Tony. At least you're consistent.
 
The Sauret video the data was taken from runs at 30 frames per second, and if one took data every frame it would be extremely noisy. Even with measurements taken every 167 milliseconds, some noise will still be present.

Symmetric differencing does generate an average about a data point and thus will smooth out noise. However, when charting data like this it is the trend which is significant.

Using every data point by simple differencing essentially doubles the noise.

The question one needs to ask is why the average between two data points on either side of the point in question causes a higher velocity than the previous average. How did it get to be greater?

This is the reason for regression analysis.
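The noise comparison between the two differencing schemes can be sketched numerically. This is an illustrative Python snippet using pure synthetic noise (the 0.167 s interval comes from the thread; the unit noise level and sample count are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.167                             # sampling interval from the thread, seconds
noise = rng.normal(0.0, 1.0, 100_000)  # synthetic position noise only, no signal

# simple (forward) differencing between adjacent samples
v_simple = np.diff(noise) / dt

# symmetric (balanced) differencing across two intervals
v_sym = (noise[2:] - noise[:-2]) / (2 * dt)

# simple differencing carries about twice the noise of symmetric differencing
print(np.std(v_simple) / np.std(v_sym))
```

On noise alone the ratio comes out near 2, which is the sense in which simple differencing "doubles the noise" relative to symmetric differencing.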

I know some will say that the measurement resolution is not sufficient to discern whether or not a jolt took place, but that is a feigned argument for two reasons: first, the trend is obviously increasing; and second, the size of the jolt required is much higher than what could be masked by one data point.

The premise of the Missing Jolt paper is valid for the reasons stated above.

In any case, we will be redoing the measurements with a more sophisticated system called Tracker, which is in the Open Source Physics project and is available on the Internet. I will make the results of the new data set public.

It would probably be good for some of you guys here to do some measurements yourself.
How can you be sure that the points in question are merely noise?
 
Easy to claim as you are also the only one to claim to be able to make sense of what you are saying.
I can make sense of it now, but I strongly believe Tony has changed the way he interprets "acceleration" in order to cover up his previous mistake, rather than admit error. He seems to be a bit sensitive about his technical savvy.
 
Easy to claim as you are also the only one to claim to be able to make sense of what you are saying.

You don't seem to understand, and it would be best if you continued to use figures to express your argument, as you did one time earlier.
 
How can you be sure that the points in question are merely noise?

That is what I believe, and I did explain that to prove it isn't, one has to answer the question of how the average about the data point in question can be higher than the previous average.

However, I also said we will be taking a new data set with a more sophisticated system.
 
Originally Posted by D'rok
This is about as straightforward and clear cut as can be.

Response, Tony? It's right there in the table on page 7 of your paper.

That appears to be true.

I'll take a look at it.
<snip>
However, I also said we will be taking a new data set with a more sophisticated system.


Dewey/Truman Prepared Headlines
...

Szamboti Admits Error - Now Debunker All Is Forgiven.
Ryan And Tony Trade Beers At Hooters

...

Szamboti Complains His Original Data Doesn't Prove Premise Therefore Data Wrong
Promises To Torture New Data Until It Does

All Is Normal
As You Were.
 
Last edited:
Dewey/Truman Prepared Headlines
...

Szamboti Admits Error - Now Debunker All Is Forgiven.
Ryan And Tony Trade Beers At Hooters

...

Szamboti Complains His Original Data Doesn't Prove Premise Therefore Data Wrong
Promises To Torture New Data Until It Does

All Is Normal
As You Were.

Basque, you forgot this part.

I am sure some here will say that the measurement resolution is not sufficient to discern whether or not a jolt took place, but that is a feigned argument for two reasons: First is that the trend is obviously increasing, and secondly the size of the jolt required is much higher than what could be indicated by one data point.

The premise of the Missing Jolt paper is valid for the reasons stated above.


Another important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.
 
Originally Posted by D'rok
This is about as straightforward and clear cut as can be.

Response, Tony? It's right there in the table on page 7 of your paper.

That appears to be true.

I'll take a look at it.
<snip>
However, I also said we will be taking a new data set with a more sophisticated system.
Basque, you forgot this part.

I am sure some here will say that the measurement resolution is not sufficient to discern whether or not a jolt took place, but that is a feigned argument for two reasons: First is that the trend is obviously increasing, and secondly the size of the jolt required is much higher than what could be indicated by one data point.

The important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.

The premise of the Missing Jolt paper is valid for the reasons stated above.

Dewey/Truman Prepared Headlines
...

Szamboti Admits Error - Now Debunker All Is Forgiven.
Ryan And Tony Trade Beers At Hooters

...

Szamboti Complains His Original Data Doesn't Prove Premise Therefore Data Wrong
Promises To Torture New Data Until It Does

All Is Normal
As You Were.
 
Another important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.


I'm disappointed you don't read all my posts.

My bolding


Wrong - TS's own data (W. D. Clinger's velocity vs. time chart) shows this negative slope; therefore deceleration occurred (assuming TS's data), and therefore the Missing Jolt paper, which is based on there being no deceleration, is wrong.


If there is a loss of velocity in the Verinage demolitions, it is because, unlike at the Towers, the falling loads of the block above are equally distributed onto the block below. And TS knows this.

Wrong – Balzac-Vitry was not a column and beam structure. It was a precast concrete loadbearing wall structure.
Balzac-Vitry Tower
[qimg]http://www.cg94.fr/files/diaporama/11190/11199p.jpg[/qimg]

Tannen Towers
“Architectural precast concrete wall panels that act as loadbearing elements in a building are both a structurally efficient and economical means of transferring floor and roof loads through the structure and into the foundation. …

The 32-story Tannen Towers condominium project in Atlantic City, New Jersey, completed in 1987 uses portal frames at the base, and bearing walls in the upper levels (see Fig. 36). The building is subdivided from top to bottom by a central corridor. A row of 37 ft (11.3 m) long bearing walls, which are typically 8 in. (203 mm) thick, runs along either side of the corridor. The walls cantilever 11 ft (3.35 m) beyond the face of the base structure on both sides of the building. To stabilize the structure, the design links pairs of bearing walls across the corridor with steel ties: back-to-back angles reinforced with a continuous plate.

The entire structure was built using precast, prestressed concrete hollow core slabs, balcony slabs, precast load bearing walls, stairs and landings.”

http://www.pci.org/view_file.cfm?file=PR-24.PDF


Another example of a concrete loadbearing wall structure
[qimg]http://www.cpci.ca/images/sectionpics/potm/22005/1.jpg[/qimg]
 
Another important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.

Verinage was specifically engineered to exhibit the perfect limiting case of an aligned column to column impact, a "luxury" that none of the towers had.

Considering you've made it abundantly clear you believe no such tilt exists with tower 1, and you want to avoid WTC 2 altogether, I'll be very surprised if it ever sinks in...
 
That is what I believe...
That's a little underwhelming

...and I did explain that to prove it isn't one has to answer the question of how the average about the data point in question can be higher than the previous average.

However, I also said we will be taking a new data set with a more sophisticated system.

Sounds like you're throwing your own data under the bus rather than reconsider your conclusion. Actually, it sounds like your conclusion is pre-determined rather than arrived at.
 
That's a little underwhelming



Sounds like you're throwing your own data under the bus rather than reconsider your conclusion. Actually, it sounds like your conclusion is pre-determined rather than arrived at.

Hardly.
 
Verinage was specifically engineered to exhibit the perfect limiting case of an aligned column to column impact, a "luxury" that none of the towers had.

Considering you've made it abundantly clear you believe no such tilt exists with tower 1, and you want to avoid WTC 2 altogether, I'll be very surprised if it ever sinks in...

Take a real hard look at the Balzac-Vitry demolition. The whole upper section shifts to the side by several feet, so it cannot have had a perfect column-on-column impact, yet it shows a serious deceleration.

The reality that a deceleration should occur even without perfect column alignment is overwhelming. The fact that it doesn't in WTC 1 is overwhelming proof that the strength of the structure below was largely removed before impact.
 
Thank you, Tony, for acknowledging that symmetric differencing is a form of data smoothing. Although you still have not acknowledged that your raw unsmoothed data show an actual decrease in velocity, I guess we should thank you for not repeating your usual denial of that fact.

Symmetric differencing does generate an average about a data point and thus will smooth out noise.
Yes, and it also smooths out signal.

Using every data point by simple differencing essentially doubles the noise.
Yes, and it also improves the resolution by a factor of two (compared to symmetric differencing).

The question one needs to ask is why the average between two data points on either side of the point in question causes a higher velocity than the previous average. How did it get to be greater?
I see two possible interpretations of this question.

You might be asking what could cause the downward velocity of a collapsing structure to increase with time, in which case the answer is gravity.

On the other hand, you might be asking us to explain to you why velocities calculated by balanced (symmetric) differencing aren't the same as those calculated by simple forward or backward differencing, in which case the answer is that balanced differencing is a form of data smoothing that effectively degrades the resolution of your data by a factor of two.

Both interpretations of your question leave me to wonder whether your paper's first author might have been responsible for the technical aspects of your paper, with the second author brought in to handle the religious aspects.

This is the reason for regression analysis and when charting data like this it is the trend which is significant.
Nonsense. Everyone knows the trend (average acceleration) is about 0.7g. If that 0.7g trend were the only significant issue, we wouldn't be having this conversation.

The central question of your paper, silly though it be, is whether that 0.7g average is (1) a fairly smooth acceleration, as would be expected from the smeared-out collision caused by a tilt, or (2) the smoothed-out average of a jerky, jolting acceleration, as expected by MacQueen and Szamboti.

I am sure some here will say that the measurement resolution is not sufficient to discern whether or not a jolt took place,
Yes, there are competent people here. One of them (Dave Rogers) has analyzed the situation correctly.

but that is a feigned argument for two reasons: First is that the trend is obviously increasing, and secondly the size of the jolt required is much higher than what could be indicated by one data point.
(Insert laughing dogs here.)

"first is that the trend is obviously increasing"
The object is being accelerated downward by gravity. No one in his right mind would expect the smoothed trend of the downward component of its velocity vector to be anything other than increasing.

"the size of the jolt required is much higher than what could be indicated by one data point."
A single data point measures position at a single point in time. It can't show anything about velocity. To say anything at all about velocity, you have to look at two or more data points. Even two data points can't show anything about acceleration. To say anything at all about acceleration, you have to look at three or more data points.

Finally, and most important: Trends and averages can't show anything about the presence or absence of jolts. In this context, jolts are brief decreases in the downward component of the instantaneous acceleration vector, and cannot be estimated without estimates of the downward component of the instantaneous velocity vector. Trends and averages say nothing about brief changes in instantaneous velocity or instantaneous acceleration.
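The three-or-more-points requirement has a standard concrete form, the central second difference. This is a generic sketch (the 0.167 s spacing is the thread's sampling interval; the 0.7 g descent is only an illustrative input, not the paper's data):

```python
dt = 0.167   # the thread's sampling interval, seconds
g = 32.17    # ft/s^2

def accel(s0, s1, s2, step):
    """Estimate acceleration from three successive positions via a
    central second difference: the minimum data needed to say anything
    about acceleration."""
    return (s2 - 2.0 * s1 + s0) / step**2

# three positions from an idealized uniform 0.7 g descent
positions = [0.5 * 0.7 * g * (k * dt) ** 2 for k in range(3)]
print(accel(*positions, dt))   # recovers the input acceleration, ~22.5 ft/s^2
```

With only two positions the second difference cannot even be formed, which is the point being made above.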

If you look at MacQueen and Szamboti's three data points for position at times 1.5, 1.67, and 1.83 seconds, they show a decrease in velocity: this is exactly the kind of jolt that MacQueen and Szamboti say is missing from their data. That's an epic fail.

The premise of the Missing Jolt paper is valid for the reasons stated above.
As I have explained above, your "reasons stated above" are laughable.

I can't just accept your claim on the basis of your technical authority, either. For at least four months now, you have been having obvious difficulties with grade school arithmetic.

In any case, we will be redoing the measurements with a more sophisticated system called Tracker, which is in the Open Source Physics project and is available on the Internet. I will make the results of the new data set public.

It would probably be good for some of you guys here to do some measurements yourself.
You and MacQueen made a truly extraordinary claim that's contradicted by your own raw data. It would have been a good idea for you to have realized that before publishing your paper instead of after.

Competent peer review would have helped.
 
The reality that a deceleration should occur even without perfect column alignment is overwhelming. The fact that it doesn't in WTC 1 is overwhelming proof that the strength of the structure below was largely removed before impact.

The Balzac-Vitry building was more robustly built than the towers, so the deceleration should be far greater than what was observed in the WTC towers. I'll let some number-cruncher work out how much. Then you can compare notes.

You handwave away too many obvious differences.
 
The graph in post #272 is not an accurate representation of the data in the Missing Jolt paper.

Yes it is. It is an accurate representation of your raw data before smoothing.

There are no velocity values in the table on page 8 which are less than those that preceded them in time, so there is never a loss in velocity, and thus there would be no negative slope in a velocity curve made from that data.

The values in table 8 are derived from the raw data by applying a two-point smoothing algorithm to the raw data, which is why there are no negative slopes in your velocity curve. Your manipulation of the data has removed the phenomenon you were trying to prove was not present, which is convenient to say the least. You're either incompetent or a liar; there is no third alternative here.

You still never answered my question on whether you have a job other than replying to posts on this forum.

The forum rules forbid me to advise you on where you can stick your personal questions.

Dave
 
Another important thing that nobody here has even attempted to address is why measurements on all of the Verinage demolitions show a very pronounced deceleration for much more than one data point and yet we don't see that with the WTC 1 measurements.

That is an outright, barefaced lie. You've been told several times that the verinage structures are completely different in construction to the WTC towers, and more importantly that the structure in a verinage demolition is symmetrically removed so that the upper block falls without a tilt. There is very much more area for the vertical structures to collide directly, the fall is vertical so that the impact is axial, and the impact is simultaneous across the entire structure. The absence of all three of these features in the WTC collapses results in either the absence of a jolt or a much smaller one; the clearly visible 2G jolt in your raw data, which you've removed by data smoothing, is easily large enough.

Dave
 
This is the reason for regression analysis and when charting data like this it is the trend which is significant.

Good god, I didn't realise it was possible for you to come up with a more moronic excuse. You're looking for discontinuities in the data, and you're using regression analysis to look for overall trends? If I even have to explain why that's idiotic, you're probably incapable of understanding the answer.

Dave
 
That is what I believe, and I did explain that to prove it isn't, one has to answer the question of how the average about the data point in question can be higher than the previous average.

It's simple arithmetic. The discrepancy between the raw and the smoothed data comes about because of the smoothing procedure you're applying. It should be obvious that a smoothing algorithm removes spurious peaks and troughs in data; that is, after all, exactly what a smoothing algorithm is supposed to do. Unfortunately, it will just as easily remove significant peaks and troughs in data. Since you're looking for a significant trough, you shouldn't be smoothing the data. If your data is noisy enough that you need to smooth it, then it's too noisy to detect the presence or absence of the feature you're looking for.
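The trough-removal effect is easy to demonstrate with toy numbers (entirely hypothetical values, using a two-point neighbour average like the smoothing at issue):

```python
# raw "velocities" with a single one-sample dip at index 3 (hypothetical numbers)
raw = [10, 15, 20, 17, 30, 35, 40]

# two-point smoothing: each value replaced by the average of its neighbours,
# mimicking the balanced-differencing average discussed in the thread
smoothed = [(raw[i - 1] + raw[i + 1]) / 2 for i in range(1, len(raw) - 1)]

print(smoothed)   # [15.0, 16.0, 25.0, 26.0, 35.0] -- strictly increasing, dip gone
```

The dip at index 3 survives in the raw series but leaves no trace in the smoothed one.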

This is first year undergraduate stuff, if that.

Dave
 
It's simple arithmetic. The discrepancy between the raw and the smoothed data comes about because of the smoothing procedure you're applying. It should be obvious that a smoothing algorithm removes spurious peaks and troughs in data; that is, after all, exactly what a smoothing algorithm is supposed to do. Unfortunately, it will just as easily remove significant peaks and troughs in data. Since you're looking for a significant trough, you shouldn't be smoothing the data. If your data is noisy enough that you need to smooth it, then it's too noisy to detect the presence or absence of the feature you're looking for.

This is first year undergraduate stuff, if that.

Dave

The size of the discontinuity if WTC 1 had collapsed naturally would be quite large and it certainly shows itself in the Verinage demolitions.

The large deceleration seen in all of the Verinage demolitions with the same measurement technique completely refutes your argument here that we are smoothing out any real deceleration.

You are forced to argue that we are missing the discontinuity, yet you haven't taken any measurements yourself. The reasons for that are that there was no deceleration in the WTC 1 collapse, you have no real argument, and you would prove yourself wrong.

Dave, it really is important to your credibility that you tell us what other work you do besides posting on this forum.
 
The size of the discontinuity if WTC 1 had collapsed naturally would be quite large and it certainly shows itself in the Verinage demolitions.

The large deceleration seen in all of the Verinage demolitions with the same measurement technique completely refutes your argument here that we are smoothing out any real deceleration.

You are forced to argue that we are missing the discontinuity, yet you haven't taken any measurements yourself. The reasons for that are that there was no deceleration in the WTC 1 collapse, you have no real argument, and you would prove yourself wrong.

Dave, it really is important to your credibility that you tell us what other work you do besides posting on this forum.

How many high-rise buildings have you designed?
 
balanced differencing hides the jolt

That is what I believe, and I did explain that to prove it isn't, one has to answer the question of how the average about the data point in question can be higher than the previous average.
It's simple arithmetic. The discrepancy between the raw and the smoothed data comes about because of the smoothing procedure you're applying. It should be obvious that a smoothing algorithm removes spurious peaks and troughs in data; that is, after all, exactly what a smoothing algorithm is supposed to do. Unfortunately, it will just as easily remove significant peaks and troughs in data. Since you're looking for a significant trough, you shouldn't be smoothing the data. If your data is noisy enough that you need to smooth it, then it's too noisy to detect the presence or absence of the feature you're looking for.

This is first year undergraduate stuff, if that.


Because this is first year undergraduate stuff, it's easy to explain using a simple example. Suppose we have sampled the vertical position s of a particle or other object at times t=0, 1, 2, and so on. Suppose further that there is some controversy as to whether its velocity ds/dt ever goes negative. (For the purposes of this thread, we might think of a negative instantaneous velocity as a "jolt".)

Suppose further that, unbeknownst to us, the true position of the particle is given by s=1+t+cos(pi*t), which means the velocity really does go negative (there really are jolts, and they occur at regular intervals). As it happens, however, the resolution of our data is just barely good enough to reveal those jolts because we sampled the position at its Nyquist rate. The true value of the position is as follows, with our sampled values shown by the plus marks:

[figure: true position s = 1 + t + cos(pi*t) with the sampled points marked]


The true velocity of this object ranges from a little under -2 to a little more than +4 (from 1-pi to 1+pi). If we calculate the velocity using simple backward differencing, we'll see the velocity alternating between -1 and +3. If we calculate the velocity using balanced (symmetric) differencing as was done by MacQueen and Szamboti, we'll see only the general trend, which is a constant velocity of +1:

[figure: velocities from simple backward differencing vs. balanced differencing]


As can be seen from this example, simple differencing tends to underestimate the extremal values of the instantaneous velocity, but it provides a far more accurate picture of the instantaneous velocity than we'd get from balanced differencing.

If we were to rely on balanced differencing only, as advocated by Tony Szamboti, then we would conclude that the object's velocity never goes negative, and there is no jolt. That conclusion would be false; the smoothing performed by balanced differencing would have led us astray.
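The numbers in this example are straightforward to check in a few lines (a direct transcription of s = 1 + t + cos(pi*t) sampled at t = 0, 1, 2, ...):

```python
import math

# positions sampled at t = 0, 1, 2, ..., from s = 1 + t + cos(pi*t)
s = [1 + t + math.cos(math.pi * t) for t in range(6)]

# simple backward differencing between adjacent samples (dt = 1)
v_backward = [s[i] - s[i - 1] for i in range(1, len(s))]

# balanced (symmetric) differencing, as used in the Missing Jolt paper
v_balanced = [(s[i + 1] - s[i - 1]) / 2 for i in range(1, len(s) - 1)]

print([round(v) for v in v_backward])   # [-1, 3, -1, 3, -1]: the jolts are visible
print([round(v) for v in v_balanced])   # [1, 1, 1, 1]: smoothing has hidden them
```

Backward differencing shows the alternating -1/+3 velocities, while balanced differencing returns the constant +1 trend with every jolt erased.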
 
And Tony's house of cards falls faster than freefall.

The charlatan has been exposed again.
 
When I first encountered the Jolt paper I wondered how anyone could ever prove (to me) how much detail they could see in the video. I saw no full description of the lens, camera, type of recording, etc., which made me immediately skeptical of any derived values. I decided it was fair to ask anyone to show me a small, distinct moving object in the video whose size is known to be (say) one foot, which seems to be the resolution claimed. Without that visual proof, just dividing some number of pixels by some other number is just a supposition. What seemed to be going on in the paper was that Tony was assuming perfect resolution, i.e. that the one-pixel value MUST be the resolving power of the video. Comments, as I like to say, welcome.
Rgrds-Ross
 
I am not sure how accurate the results you can get from counting pixels on a video are. What I am sure of is that there are professionals who are very sensitive about their professional reputation, and are really careful when making statements.

Try guess who are who on this subforum.
 
I am still puzzled about the concept of a "jolt." As I understand it, there should have been a massive deceleration of the upper block of the towers as they contacted the first unbroken floors or columns.

I would submit that, since the mechanism of collapse had nothing to do with vertical compression of any but a few core columns, this is not as big a deal as Tony wants us to think it is.

I get the impression, though, that we do see a manifestation of the "jolt" when the core columns collided in the "kink" near the top of the south tower.

My impression is that the collapse began with the whole upper block moving in one direction, slightly off of a straight perpendicular drop. The kink may have developed when core-column-to-core-column contact arrested movement in that direction, and the force of gravity pulling down on the hat truss was enough to overcome the momentum of the rest of the structure that was pulling it toward the other side of the building.

Sorry I can't put it into engineers' jargon. I'm in this as a fire fighter and sometimes construction laborer.
 
I think of Bazant's calculations giving a factor of 10 off, under perfect conditions, and compare that to columns hitting floors. A jolt does not sound too likely to me, but as an electrician my opinion has little backup.
 
No it isn't. I'm pointing out some very obvious faults in your data analysis, not appealing to my own authority. And the deceleration is still visible in your own data.

Dave

If the one point you speak of, at 1.834 seconds, was a real deceleration, then the distance traveled of the data point at 2.000 seconds would have been less because of it. This would have caused the average velocity between the points on either side of 1.834 to be less than the average velocity between the two points on either side of the 1.667 second data point before it. But it isn't, and this is why it cannot be considered a real deceleration.

You haven't even tried to explain that.

As I have said, the same measurement technique picks up the very pronounced decelerations in every one of the verinage demolitions, where a real gravity-only collapse is known to be occurring. These decelerations are also observed over a number of data points, which is what would have happened in WTC 1 if it had a real deceleration.

You are obviously desperate and willing to pick on some small measurement noise to try and find any way of saying there was a possibility of the collapse of WTC 1 being naturally caused. I am actually sorry to say this lack of real deceleration proves it could not have been.
 
If the one point you speak of, at 1.834 seconds, was a real deceleration, then the distance traveled of the data point at 2.000 seconds would have been less because of it. This would have caused the average velocity between the points on either side of 1.834 to be less than the average velocity between the two points on either side of the 1.667 second data point before it. But it isn't, and this is why it cannot be considered a real deceleration.

You haven't even tried to explain that.

As I have said, the same measurement technique picks up the very pronounced decelerations in every one of the verinage demolitions, where a real gravity-only collapse is known to be occurring. These decelerations are also observed over a number of data points, which is what would have happened in WTC 1 if it had a real deceleration.

You are obviously desperate and willing to pick on some small measurement noise to try and find any way of saying there was a possibility of the collapse of WTC 1 being naturally caused. I am actually sorry to say this lack of real deceleration proves it could not have been.

Considering what Dave already posted regarding the magnitude of uncertainty of your measurements, I'm not sure this is picking on "some small measurement of noise." Basically, the quality of your data sucks, and it is certainly not sufficient for you to claim that it rules out a deceleration.
 
Considering what Dave already posted regarding the magnitude of uncertainty of your measurements, I'm not sure this is picking on "some small measurement of noise." Basically, the quality of your data sucks, and it is certainly not sufficient for you to claim that it rules out a deceleration.

As I said, the average about the 1.834 second data point would have to be less than the average about 1.667 seconds before it, if the point at 1.834 was a real deceleration. The fact that it isn't proves there was not a real deceleration there.
 
As I said, the next average would have to be less than the average before it, if the point at 1.834 was a real deceleration. The fact that it isn't proves it is not.

Ah, but you're calculating those averages without first knowing the uncertainty of the original data, other than it shows at least a 1g error, which is about +/- 5 fps at your sampling rate, which is huge compared to the difference between your average velocities.
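For reference, the +/- 5 fps figure follows from sustaining a 1 g error over one sampling interval (assuming g = 32.17 ft/s^2 and the 0.167 s spacing discussed in the thread):

```python
g = 32.17    # ft/s^2
dt = 0.167   # seconds: the sampling interval (every 5th frame of 30 fps video)

# a 1 g acceleration error sustained over one sampling interval
# corresponds to roughly this much velocity uncertainty
dv = g * dt
print(round(dv, 2))   # ~5.37 ft/s, i.e. about +/- 5 fps
```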
 
Ah, but you're calculating those averages without first knowing the uncertainty of the original data, other than it shows at least a 1g error, which is about +/- 5 fps at your sampling rate, which is huge compared to the difference between your average velocities.

You are somehow skipping several steps when you jump to working in your alleged 1g error with measuring distance in feet, which must come from your struggle to understand what constitutes a deceleration.

It seems you have been trying too hard to refute something that unfortunately is irrefutable.
 
Wrong.

You are somehow skipping several steps when you jump to working in g's with measuring distance in feet, which must come from your struggle to understand what constitutes a deceleration.

It seems you have been trying too hard to refute something that unfortunately is irrefutable.

Wrong several times again, Tony. Consistency is certainly your strong point.
 
Tony,

Why are you still arguing about whether your data contains acceleration rate reduction ?

You know that I, OWE, and Achimspok have all generated trace data from the same footage which is of a much higher accuracy and resolution.

You also know that data does indeed contain what I would term *mini jolts*.

I'm aware that you think the jolt magnitude(s) should be larger.

You know the upper block wasn't rigid, that there is no direct reason to think that *jolts* in the core would be transmitted such that those *jolts* would be of large magnitude at the NW corner, and that recent FEA analyses show a much lower *jolt* magnitude the further from the point of impact you look.

I have absolutely no idea why you are not using the higher resolution data in trying to state your case...
[chart: higher-resolution drop trace data]

...you have access to the raw data after all.

I definitely think that the lack of upper block rigidity is something you should look into further in terms of progressing your viewpoint.
 
If the one point you speak of, at 1.834 seconds, was a real deceleration, then the distance traveled of the data point at 2.000 seconds would have been less because of it. This would have caused the average velocity between the points on either side of 1.834 to be less than the average velocity between the two points on either side of the 1.667 second data point before it. But it isn't, and this is why it cannot be considered a real deceleration.

Incoherent rubbish. You're trying to claim that any isolated minimum in the data cannot be a real data point, and then concluding that there are no isolated minima in the data. You're specifically excluding from consideration the exact phenomena you claim to be looking for.

Dave
 
Incoherent rubbish. You're trying to claim that any isolated minimum in the data cannot be a real data point, and then concluding that there are no isolated minima in the data. You're specifically excluding from consideration the exact phenomena you claim to be looking for.

Dave

The original measurement data in the Missing Jolt paper was taken by hand using a pixel measuring tool called Screen Calipers.

We retook the data last night with a much more sophisticated and automated tool called Tracker, which is meant for just this sort of thing and locks onto the feature to be measured. These measurements show the distance traveled between 1.667 and 1.834 seconds into the fall of WTC 1 is greater, not less, than the distance traveled between 1.500 and 1.667 seconds into the fall.

So it was in fact noise in the hand data, probably caused by not being precisely locked on the point being measured for each measurement.
 
The original measurement data in the Missing Jolt paper was taken by hand using a pixel measuring tool called Screen Calipers.

We retook the data last night with a much more sophisticated and automated tool called Tracker, which is meant for just this sort of thing and locks onto the feature to be measured. These measurements show the distance traveled between 1.667 and 1.834 seconds into the fall of WTC 1 is greater, not less, than the distance traveled between 1.500 and 1.667 seconds into the fall.

So it was in fact noise in the hand data, probably caused by not being precisely locked on the point being measured for each measurement.

I'm not in the least surprised; as I've said all along, any tilt of the upper block eliminates the possibility of a significant jolt on impact, which is of course the key difference between the WTC collapses and a verinage demolition. However, the data you've published in the paper still contains a feature which your conclusion incorrectly asserts is not present, even though it's simply a noise artefact. In the circumstances, I would suggest that the minimum responsible action right now would be to withdraw your paper from the Journal of 9/11 Studies immediately, pending a re-write.

Dave
 
